On Thu, Mar 18, 2004 at 10:36:15AM -0800, Mark Knecht wrote:
> OK, good points one and all. I hadn't considered it in quite that way
> before. To me (and I could be wrong) what you talk about above sounds
> more like format conversion. I'm not a programmer so I could have this
> messed up from a programmer's POV, but converting data from a 24-bit
> ADC on receive to some internal float format and then later back to a
> 24-bit DAC on transmit is not what I think of as 'dithering'. I grant
> you that there might be some form of dithering, in the sense of
> removing low-order bits, taking place. Some ADCs insert noise in their
> conversion at record time. However, I wouldn't have thought that
> either apps or JACK are inserting noise into the process at all those
> steps, are they?
Floats have something like 25 bits' worth of resolution at 0 dB, so when
converting down to 16 bits you have to drop some bits. If you drop bits
and don't dither, you get distortion in the output.
So yes, JACK adds noise and then truncates the integer representation
before sending it to ALSA in the 16-bit case. It's not turned on by
default though, for no good reason.
> My POV in the previous statement was more about the form of dithering
> we use when outputting specifically to a CD, for instance: the
> lost-bits-at-the-bottom problem, and the fact that people can hear
> below the noise floor of 16-bit audio. I don't think people can hear
> below the noise floor of 24-bit audio, or at least my ears certainly
> can't.
You can't represent 24 bits of resolution electrically in analogue at all,
so no, you can't hear it until we get those FireWire head-sockets :)
- Steve