Date: Sun, 26 Feb 2006 16:08:21 -0800
From: "Maluvia" <terakuma(a)imbris.net>
Subject: [linux-audio-user] Re: Companies Refusing to Release/Permit
<snip>
> That's why we are stuck at 24bit.
Well thank you for a scientific explanation of this ceiling.
I guess, then, that *real* 24-bit resolution, or something very close to
it, would yield what I am looking for - if it can be achieved.
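Aside: the "20-21 real bits" ceiling drops straight out of the
dynamic-range formula for an ideal N-bit quantizer, DR = 6.02*N + 1.76
dB, once you cap the analog front end at a realistic thermal-noise SNR.
A minimal Python sketch; the 125 dB front-end figure is an assumed
number for illustration, not a measurement:

def ideal_dr_db(bits):
    """Theoretical dynamic range of an ideal N-bit quantizer, in dB."""
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit ideal: {ideal_dr_db(bits):6.2f} dB")
# -> 98.08 dB, 122.16 dB, 146.24 dB

analog_snr_db = 125.0  # assumed SNR of a very good analog front end
effective_bits = (analog_snr_db - 1.76) / 6.02
print(f"effective bits: {effective_bits:.1f}")  # ~20.5 - the ceiling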
> Are you sure that's what you are looking for? Recording is about
> creating illusions, not fidelity. If you record an acoustic guitar in
> a totally dead room with the flattest, most accurate mic and pre, into
> the best A/Ds in the world, it sounds... ok.
> Put some reverb and top end on it, a little compression, perhaps add a
> little distortion with an aural exciter, or record to tape, and people
> will say 'wow, what an amazing fidelity guitar recording!' :)
I agree with this to a certain extent, but the quality of the effects -
or of the final signal after the effects are added - is affected by the
fidelity of the original signal.
There is a huge difference in our guitar sound put through an 8-bit
Zoom processor, an 18-bit Alesis Q2, a 20-bit Alesis Q20, and a
Behringer "24"-bit V-Verb.
> Ah, but there are so many differences between those effects besides
> their bit depths. Let me guess, they sound better in the chronological
> order they were released in? The amount of DSP available and the
> quality of the code have changed too...
I think it is about both - using a high-fidelity
acoustic signal blended
with creative, high-quality effects to create a beautiful auditory
experience.
> I agree. Though fidelity does not always equal sounding better. That's
> why we don't use ultra-flat measurement mics to record everything.
> Bullshit. If you can hear the difference between a 20-bit converter
> and a >20-bit one, what you hear is the difference between two
> converters, regardless of the number of bits they use.
And you can prove this?
I would assume that if "24-bit" converters are really only 20-21 bits,
then a so-called "20-bit" converter is likely much less than 20 bits.
I maintain that I *can* hear bit-depth differences.
Are you perhaps suggesting that there exists some bit-depth threshold
with respect to human hearing?
What do you base your comment on?
> Even 16 bits correctly dithered is better than 24 tracks on a 2-inch
> tape.
Again, what do you base this on?
Recording what?
"Correctly dithered" - and you would maintain that there is some objective
standard as to what constitutes this?
I can hear the distortion of the audio signal created by dithering, just as
I can hear the distortion of the audio signal created by Dolby - and I
don't like it.
> Paradoxically, the only way to avoid digital artifacts is by the use
> of dithering. This can be proved. There is such a thing as correct
> dithering.
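Aside: "correct dithering" in the textbook sense means adding TPDF
noise of +-1 LSB peak before requantizing, which provably makes the
quantization error a flat, signal-independent hiss instead of
distortion. A minimal numpy sketch of the classic demonstration - a
sine below 1 LSB is erased by plain truncation but survives dithering;
the 16-bit target and 997 Hz tone are arbitrary illustration choices:

import numpy as np

rng = np.random.default_rng(0)
fs, f, bits = 48000, 997.0, 16
t = np.arange(fs) / fs
lsb = 2.0 ** (1 - bits)                    # step size for full scale +-1.0
x = 0.4 * lsb * np.sin(2 * np.pi * f * t)  # test tone well below 1 LSB

def quantize(sig, dither):
    # TPDF dither: sum of two uniform +-0.5 LSB sources (triangular pdf)
    d = (rng.uniform(-.5, .5, sig.size) +
         rng.uniform(-.5, .5, sig.size)) * lsb
    return np.round((sig + (d if dither else 0.0)) / lsb) * lsb

ref = np.sin(2 * np.pi * f * t)
for dither in (False, True):
    y = quantize(x, dither)
    recovered = np.dot(y, ref) * 2 / ref.size  # coherent sine amplitude
    print(f"dither={dither}: {recovered / (0.4 * lsb):5.2f} of the tone survives")
# without dither the tone quantizes to silence; with dither ~1.00 survives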
If you think existing digital technology can already match or exceed the
audio fidelity of a 24-track reel-to-reel recorder, I would very much like
to know what it is, and where it is available - and I would like to hear
it.
> Fidelity is a measure of how closely the signal you get out of your
> recorder matches what you put into it.
> You still probably won't believe me, but the fidelity of a £100 card
> like an Audiophile 24/96 will be greater than that of a 24-track 2".
> The Audiophile will have a lower noise floor, better linearity, no
> scrape flutter or wow, much lower crosstalk between channels, much
> less IMD, wider frequency response (and a more solid bass end)... but
> it might not sound as 'good'.
> I don't know if you have ever worked with tape, but you really did
> have to be so much more careful than with digital about getting a good
> level to cut down noise, putting non-critical tracks on 1 and 24 as
> they always got a bit knackered on reels and transport, recording at
> lower levels if the source had lots of HF content, line-up and bias...
> all this stuff was a total pain in the arse. Most everyone used some
> kind of noise reduction, unless they were pushing the tape really
> hard, in which case the distortion figures are laughable compared to
> digital.
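Aside: "fidelity = how closely the output matches the input" can be
made literal: loop a reference signal through the chain and measure how
much of the capture is *not* the reference. A rough Python sketch; the
file names are hypothetical, and it assumes the capture is already
time-aligned (tape wow and flutter would break that assumption and
simply show up as a worse number):

import numpy as np

def fidelity_db(ref, cap):
    """Fitted-signal-to-residual power ratio in dB."""
    n = min(ref.size, cap.size)
    ref, cap = ref[:n], cap[:n]
    g = np.dot(cap, ref) / np.dot(ref, ref)  # single best-fit gain
    s = g * ref
    residual = cap - s                       # noise + distortion + crosstalk
    return 10 * np.log10(np.dot(s, s) / np.dot(residual, residual))

# usage with (hypothetical) loopback recordings, via the soundfile module:
#   import soundfile as sf
#   ref, _ = sf.read("reference.wav")
#   cap, _ = sf.read("loopback.wav")
#   print(f"fidelity: {fidelity_db(ref, cap):.1f} dB")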
-Maluvia