[linux-audio-dev] 8bit sound wav playing to a 16bit sound card...

Derrick derrick at logicalsi.com
Fri Jun 13 08:57:01 UTC 2003

> I think someone else (Paul?) hit it on the head.  If you load 2 8-bit
> samples and pass them to a soundcard that is expecting 16-bit samples,
> you'll get a LOT of garbage - like white noise - and it will be 1/2 as long
> as your input sample.
> See, each pair of 8 bits will become a 16 bit sample.
> input 8 8bit samples:
> dec:  0    64   127  64   0    -64  -127 -64
> hex:  0x00 0x40 0x7f 0x40 0x00 0xc0 0x81 0xc0
> read as 4 16bit samples (ignore endianness):
> hex:  0x0040 0x7f40 0x00c0 0x81c0
> dec:  64     32576  192    -32320
> Notice how the waveforms don't resemble each other at all.
> Clearer, now?

I see. So if I wanted to convert to 16-bit, how would you recommend I do 
this? It would seem I need some type of filler... err, white noise or 
just blank noise to fill in the extra 8 bits. I guess I could convert every 
sample read from the file from 8-bit (unsigned char) to 16-bit (signed short), 
then write it to /dev/dsp?

Bear with me here, I'm not a veteran.. So I would have to convert the 
unsigned char to a signed char, then to a signed short... Not knowing how 
the conversion is done in the OS, I'm assuming that the resulting signed 
short would be padded with 'off' bits, which would come out as silence, 
correct? ( but it's in the same sample, so you really wouldn't hear the 
silence )


It looked like something resembling white marble, which was
probably what it was: something resembling white marble.
		-- Douglas Adams, "The Hitchhikers Guide to the Galaxy"
