On Mon, Dec 19, 2011 at 16:20, Fons Adriaensen <fons(a)linuxaudio.org> wrote:
> On Mon, Dec 19, 2011 at 03:21:19PM +0100, Johan Herland wrote:
>> Yeah. After some thinking it's now obvious to me why I shouldn't
>> upsample (as long as the rest of the pipeline can do the sample
>> rates I throw at it). However, I do want to use as many bits as
>> possible in processing, to make sure that I don't lose detail by
>> shifting bits into oblivion.
> This is a non-problem. 24 bits gives you a S/N ratio better than
> 140 dB. Now assume that you adjust the volume of your power amps
> such that -20 dB digital corresponds to +100 dB SPL. This is pretty
> loud and you still have 20 dB headroom for peaks. That means that
> the digital noise floor of your system corresponds to -20 dB SPL,
> that is 20 dB below the hearing threshold and at least 40 dB
> below the ambient noise level of any normal living room. So you
> won't hear it. Even if a low-level signal uses only 8 bits
> or so, you will not hear any distortion as a result of that.
Thanks for quantifying this for me. Between the crazy audiophile
forums and my lack of experience in this field, it's not always easy
to keep track of how much is good enough, and what is just crazy
over-engineering. :)
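
For my own notes, here's a quick sanity-check of those numbers using
the usual 6.02 dB/bit rule of thumb (my own back-of-the-envelope
sketch, not something from your mail):

    # Quantization SNR of an N-bit signal: ~6.02*N + 1.76 dB.
    # With -20 dBFS aligned to +100 dB SPL, full scale is +120 dB SPL,
    # so the noise floor ends up at (120 - SNR) dB SPL.
    for bits in (16, 24):
        snr = 6.02 * bits + 1.76
        print(f"{bits}-bit: SNR ~{snr:.0f} dB, "
              f"noise floor ~{120 - snr:+.0f} dB SPL")
    # => 16-bit: SNR ~98 dB, noise floor ~+22 dB SPL
    # => 24-bit: SNR ~146 dB, noise floor ~-26 dB SPL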
>> I don't see how the _entire_ pipeline can run off a single external
>> clock. Say I have a simple setup: a CD player connected to the audio
>> PC doing a simple digital volume control, and then outputting the
>> signal to a (Power)DAC. Obviously there will be some (small but
>> still non-zero) delay between the audio PC's input signal and its
>> output signal. How can both signals run from the same clock?
> There will simply be a fixed number of samples delay between input
> and output. The thing that matters is that the sample frequencies
> are exactly the same. If they are not you have to resample.
I see. And here, using the same nominal sample rate on both sides of
the PC is not necessarily good enough; both sides actually have to run
off the very _same_ clock, so that there's no chance of drifting.
>> Or does the audio PC sync the output to a _later_ pulse from the
>> clock generator (i.e. all that matters is that the signal is synced
>> with _a_ clock pulse, not necessarily _the_same_ clock pulse with
>> which it was originally transmitted)?
> Indeed, that is what happens.
>> But if so, couldn't I have one clock between the CD player and
>> audio PC, and a different clock between the PS and the DAC?
> (I assume you mean _PC_ and DAC, or PC and digital amp).
> In that case the PC has to do sample rate conversion. It will also
> considerably complicate your SW.
Ok, I think I see now. Basically, in order to prevent sample rate
conversion or extensive buffering (to compensate for clock drift), you
must make sure that the input and output run off the very same clock
(with a constant, whole number of samples' worth of audio being
processed at any time in the audio PC).
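
Just to convince myself of the scale of the drift problem, assuming
two free-running clocks that are nominally 44.1 kHz but differ by a
typical crystal tolerance of 50 ppm (my numbers, not yours):

    rate = 44100              # nominal sample rate (Hz)
    ppm = 50                  # frequency mismatch between the clocks
    drift = rate * ppm / 1e6  # samples gained/lost per second
    print(f"~{drift:.1f} samples/s, ~{drift * 3600:.0f} samples/hour")
    # => ~2.2 samples/s, ~7938 samples/hour; no fixed-size buffer can
    #    absorb that forever, so you either share a clock or resample.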
>> And is self-clocking somehow inferior to an external clock? If an
>> external clock is better, how come ethernet/USB/firewire and all
>> other digital communication protocols run without external clocks?
>> Sorry for being thick, but I haven't worked with these things
>> before...
> In most digital formats the clock can be regenerated from the
> digital signal. There's no quality loss involved in that - clock
> jitter at a digital input doesn't matter as long as the clock is
> good enough to recover the data without error. Professional audio
> installations use a separate clock because 1) that is more reliable
> when long cables are used, and 2) it makes the synchronisation
> independent of the signal routing, which in e.g. a studio isn't
> just a simple linear chain as it would be in a domestic player
> setup, and usually isn't fixed either.
Understood. In my case, I'll have a fairly linear and fixed setup, and
my cable runs will not exceed 10-15 meters (which is towards the upper
end of what SPDIF can handle, IINM), so self-clocking should hopefully
be sufficient.
>>> but this also means that if you switch from blu-ray player a to
>>> blu-ray player b, your sound card must change its clocking source
>>> from input a to input b. which might or might not cause an audible
>>> click or thump.
Can't you "fix" this by quickly muting the signal, then switch
sources, and unmute after you've "locked" onto the new signal? If the
switch is software-controlled (via RS-232) wouldn't that be fairly
simple to do?
> Yes, you could mute the amps for a short time. Anyway, when the PC
> input clock is switched, the output clock will follow, your amps
> will detect that and probably mute automatically until they are
> resynced. That is at least what I'd expect from something costing
> $6000.
Yeah, I was hoping to keep it below that. At least until I can prove
to myself that the potential audible improvement will be worth it. I'm
basically trying to keep the cost down while I experiment, and once I
get more familiar and comfortable with the system as a whole, I can
make a more qualified decision on where to put the bigger money.
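
As for the muting trick above: if the switching really is RS-232
controlled, I imagine the sequence would look roughly like this
(a pySerial sketch with entirely made-up command strings, since I
don't know the actual protocol of any of this gear):

    import serial, time

    def switch_source(port, new_input):
        # Mute, switch input, wait for the receiver to re-lock, unmute.
        # "MUTE"/"INPUT" are hypothetical placeholder commands.
        amp = serial.Serial(port, 9600, timeout=1)
        amp.write(b"MUTE ON\r\n")
        amp.write(f"INPUT {new_input}\r\n".encode())
        time.sleep(0.5)       # allow time to re-sync to the new clock
        amp.write(b"MUTE OFF\r\n")
        amp.close()

    switch_source("/dev/ttyS0", 2)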
>> I don't know whether it's cheaper/easier to convert AES/EBU or ADAT
>> to SPDIF coax/toslink...
> AES/EBU and SPDIF are (almost) the same. The main differences
> are in the electrical signal level and impedance. There are some
> small coding differences as well, but in general an SPDIF input
> will work with an AES signal, and an AES input *may* work with
> an SPDIF signal provided the signal level is high enough, but
> don't count on that.
Ok, so I might be able to convert from AES/EBU to SPDIF using a simple
(maybe even passive) circuit?
That seems like it'd be much cheaper than converting ADAT to SPDIF.
And it would even give me the option of upgrading to higher-end
AES/EBU amps along the way. Not to mention that AES/EBU will do longer
cable runs than SPDIF.
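
(For reference, as far as I can tell the electrical specs are roughly:
AES/EBU is 110 ohm balanced at a few volts, while SPDIF is 75 ohm
unbalanced at ~0.5 V, so a small transformer plus a resistive
attenuator seems to be the usual passive solution. I should verify
this before buying anything.)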
>> Then again, there are some digital amps that take USB input (e.g.
>> the miniSTREAMER/miniAMP combo <URL:
> In that case the PC's output sample rate is set by the amplifier.
> Which means you can connect only one such amplifier unless they
> are synchronised with external clocks (which they probably can't
> do). And even if you can get them synced, the PC will still have
> to do sample rate conversion if it plays from an external source.
> So I'd say 'don't go that way'.
Thanks. That makes sense now that I have a grasp on the clocking business.
Thanks a lot for your help. I really appreciate it!
Have fun! :)
...Johan
--
Johan Herland, <jherland(a)gmail.com>
www.herland.net