On Fri, Feb 17, 2012 at 12:32:01PM -0500, gene heskett wrote:
[bolometers]
Either of those methods costs 500-5000 USD to accomplish.
The average house & garden multimeter is indeed completely useless
for measuring anything audio. OTOH
* There are quite good handheld audio RMS meters which don't
  cost a fortune (though they sit at the upper end of handheld
  meter prices). They use analog integrated circuits which can
  be quite
accurate - at least for normal audio use. They are not
laboratory standards of course.
* Any pro-quality audio card, once calibrated against a known
signal and combined with some simple software will make a
near-perfect RMS meter *for the audio band* and as long as
you don't drive it into clipping.
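The "simple software" part really is simple. A minimal sketch of
the idea (the 48 kHz rate, block size and test signal are just
example values, not anything specific):

```python
import math

def rms_dbfs(samples):
    """RMS level of a sample block, in dB relative to digital full scale."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(rms) if rms > 0.0 else float("-inf")

# A full-scale sine has RMS = 1/sqrt(2), i.e. about -3.01 dBFS.
sine = [math.sin(2.0 * math.pi * 440.0 * n / 48000.0) for n in range(48000)]
print(round(rms_dbfs(sine), 2))  # -> -3.01
```

Calibration then just means measuring the dBFS reading for a known
analog level once, and adding that constant offset to read out dBu.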
[cable impedance]
Why? The common two wire & foil shielded audio cable, used in broadcast
and studio facilities in miles per studio quantities, actually has an
impedance in the 60 ohm area! Feed it with a 600 ohm source and 300 feet
of cable later it's rolled off like a Ma Bell telephone circuit. Your audio
DA's, to drive that, need to source terminate at 30 ohms per wire, from a
very low impedance amplifier.
The concept of cable impedance makes sense only if the length
becomes a non-trivial fraction of the wavelength. For audio that
means that for anything shorter than a few hundred meters only
the capacitance matters. And yes, you need hefty line
drivers and low output impedance to push 20 kHz at +20 dBu down
a long line. Which is one of the reasons why real pro quality
analog audio remains expensive.
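The numbers are easy to check. A sketch, assuming a velocity
factor of 0.66 and 200 pF/m cable capacitance (both just plausible
values for a foil-shielded pair, not measured data):

```python
import math

C_PER_M = 200e-12        # assumed cable capacitance per metre
VELOCITY = 0.66 * 3.0e8  # assumed propagation speed in the cable, m/s

def wavelength_m(freq_hz):
    return VELOCITY / freq_hz

def cutoff_hz(source_ohms, length_m):
    """-3 dB point of the RC low-pass formed by the source impedance
    and the total cable capacitance."""
    c_total = C_PER_M * length_m
    return 1.0 / (2.0 * math.pi * source_ohms * c_total)

# At 20 kHz the wavelength in the cable is nearly 10 km, so even
# 100 m of cable is only about 1 % of a wavelength: no transmission
# line effects, cable impedance is irrelevant.
print(round(wavelength_m(20e3)))

# 300 ft (about 91 m) driven from 600 ohms: the -3 dB point lands
# inside the audio band, the "Ma Bell" rolloff.
print(round(cutoff_hz(600, 91)))

# The same cable from a low-impedance (60 ohm) source: the rolloff
# moves well above the audio band.
print(round(cutoff_hz(60, 91)))
```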
Ciao,
--
FA
Vor uns liegt ein weites Tal, die Sonne scheint - ein Glitzerstrahl.
(Before us lies a wide valley, the sun shines - a glittering ray.)