On Sat, Apr 04, 2009 at 02:23:24PM -0700, Kevin Cosgrove wrote:
> On 4 April 2009 at 23:07, Julien Claassen <julien(a)c-lab.de> wrote:
> > Does it make much difference if you record at 48kHz or 96kHz if
> > you finally get down to 44.1kHz output for the public? I mean
> > realistically, not just in theory viewed on some analyzer.
> I've read a few things in "Recording" magazine over the last few
> years which indicate that bit depth is much more important than
> sample rate when it comes to compression. If you start with 16-bit
> audio and then compress it, the signal ends up squeezed into fewer
> than 16 bits, and noise fills up the remaining bits. They recommend
> going with more bits. I record at 24 bits. I don't see any
> usefulness in recording at a higher sample rate when my target is
> 44.1kHz. Those same articles didn't say higher sample rates were
> bad. But they did say that extra bits are much better than faster
> samples, at least when it comes to compression issues.
Yes, bit depth is very important and easy to notice, especially when
the recorded sound has loud and soft passages. Quieter passages are
captured with little detail at 16 bits, and if you plan to compress,
amplify or edit the result, the artifacts produced by the low bit
depth become noticeable.
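
Here is a quick back-of-the-envelope way to see that effect. This is
a minimal sketch of my own (assuming Python with numpy installed; it
is not from the magazine articles): it quantizes a very quiet tone at
16 and 24 bits, then amplifies it by 60 dB the way a compressor or
normalization pass would, and measures the noise that comes up with
it.

import numpy as np

fs = 44100                    # sample rate in Hz
t = np.arange(fs) / fs        # one second of audio
quiet = 0.001 * np.sin(2 * np.pi * 440 * t)   # 440 Hz tone at -60 dBFS

def quantize(x, bits):
    # Round onto the grid of a signed integer of the given bit depth.
    levels = 2.0 ** (bits - 1)
    return np.round(x * levels) / levels

for bits in (16, 24):
    # The quantization error, amplified 1000x (+60 dB) as if
    # normalizing the quiet passage up to full scale.
    error = (quantize(quiet, bits) - quiet) * 1000
    rms_db = 20 * np.log10(np.sqrt(np.mean(error ** 2)))
    print("%2d-bit: amplified quantization noise ~ %.1f dBFS" % (bits, rms_db))

That works out to roughly -41 dBFS of noise at 16 bits versus roughly
-89 dBFS at 24 bits, which matches the "noise fills the remaining
bits" description above.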
Sample rate is a matter of taste, and harder to notice. I've spent a
lot of time lately recording at 96kHz with a Delta66 sound card and
Sennheiser headphones. Until then I had never noticed the difference,
but I think my ears "learned" during the process, so it became very
easy for me to tell apart sounds recorded at 44.1kHz and sounds
recorded at 96kHz. I did my test listening with strings. Higher
pitched harmonics sounded perceptually "damaged", like hearing a
heavily compressed MP3 file. It is hard to describe; probably other
people on this list with more technical knowledge could give a more
accurate description of the differences. On the other hand, some
people seem unable to notice it, and one friend was actually unable
to hear those harmonics no matter the sample rate (maybe starting at
C10...).

Another "guinea pig" (a friend known to have an extremely
well-trained ear) reported that the differences were huge, and he
even claimed that sounds at 44.1kHz are not realistic enough and
sound like "a poor imitation of the real instrument". He correctly
identified 100% of my 44.1kHz recordings (whether recorded directly
at 44.1kHz or downsampled from 96kHz) by listening to just the first
two seconds of audio. Really impressive, since I need almost a minute
to start hearing *very slight* differences. The funniest thing is
that he does not seem to be bothered by noise, bad equalization or
poor acoustics in the recording room, yet losing some resolution at
higher frequencies is a major annoyance for him.

So I guess there is no single correct answer; it's probably best to
try it yourself and choose the sample rate that gives good results
for you. It's probably not a big issue unless you want an
audiophile-proof recording, but if storage and processing power are
not a problem, just use the best your hardware allows.
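
About those damaged high harmonics: part of the arithmetic behind it
is just Nyquist. Here is a small sketch of mine (assuming Python with
numpy and scipy; the C7 tone is a made-up example, not one of my
actual recordings) counting how many harmonics of a high string note
survive a 96kHz to 44.1kHz conversion.

import numpy as np
from scipy.signal import resample_poly

fs_hi = 96000
t = np.arange(fs_hi) / fs_hi           # one second at 96kHz
f0 = 2093.0                            # C7, a plausible high string fundamental

# Every harmonic the 96kHz file can hold, i.e. below its 48kHz Nyquist.
harmonics = [n * f0 for n in range(1, 24) if n * f0 < fs_hi / 2]
tone = sum(np.sin(2 * np.pi * f * t) / n
           for n, f in enumerate(harmonics, start=1))

# 44100/96000 reduces to 147/320, so polyphase resampling is exact here.
tone_44 = resample_poly(tone, 147, 320)

survivors = [f for f in harmonics if f < 44100 / 2]
print("harmonics present at 96kHz:     %d" % len(harmonics))
print("harmonics a 44.1kHz file keeps: %d" % len(survivors))

The overtones above the tenth harmonic of C7 simply cannot exist in
the 44.1kHz version. Whether those ultrasonic overtones are actually
audible is of course the controversial part; the sketch only shows
what is mathematically removed.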