[linux-audio-dev] Paper on dynamic range compression
Alfons Adriaensen
fons.adriaensen at alcatelaleniaspace.com
Thu Oct 5 08:45:54 UTC 2006
On Wed, Oct 04, 2006 at 10:51:08PM -0500, Andres Cabrera wrote:
> I've written a paper analyzing the characteristics of some software
> dynamic range compressors. The interesting part concerning linux, is
> that all my results show jaggedness on the gain reduction curves. I did
> the tests using Ardour, Audacity and Rezound with the same results,
> which points to something strange in some part of the process.
Or in your measurement methods, which are ill-defined, making it all but
impossible to interpret some of the results correctly.
- How do you measure gain? By comparing single input/output samples?
It seems so; otherwise, how would you obtain a gain vs. time curve at
50 Hz with sub-millisecond resolution (Fig. 3)?
This will be invalid in many cases, for example if the plugin includes
audio delay to achieve 'zero latency', as you suggest some of them do.
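(Not from the paper, just a sketch of my own: a gain-vs-time measurement that does not rely on comparing single samples would take the ratio of short-window RMS levels of output to input, with the signals already delay-compensated. The function name and window length are hypothetical.)

```python
import numpy as np

def gain_curve(x, y, fs, win_ms=5.0):
    """Gain vs. time as the dB ratio of short-window RMS levels of
    output y to input x (equal length, delay already compensated)."""
    n = max(1, int(fs * win_ms / 1000))   # window length in samples
    hops = len(x) // n
    t = np.empty(hops)
    g = np.empty(hops)
    for k in range(hops):
        seg = slice(k * n, (k + 1) * n)
        rx = np.sqrt(np.mean(x[seg] ** 2))  # input RMS in this window
        ry = np.sqrt(np.mean(y[seg] ** 2))  # output RMS in this window
        g[k] = 20 * np.log10(ry / rx) if rx > 0 else np.nan
        t[k] = (k + 0.5) * n / fs           # window centre time
    return t, g
```

Note that even this gives at best one meaningful gain value per window, and the window must cover at least a good fraction of a period of a 50 Hz test tone, which is exactly why sub-millisecond resolution at 50 Hz is suspect.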
- This delay is the first thing that should be measured. Without
this information it is impossible to evaluate the results.
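(Again my own sketch, not the paper's method: one common way to measure such a delay is to feed the plugin a noise burst and locate the peak of the cross-correlation between input and output. The helper name is hypothetical.)

```python
import numpy as np

def plugin_delay(x, y):
    """Delay (in samples) of output y relative to input x, found at the
    peak of their cross-correlation. Works best with a noise signal."""
    c = np.correlate(y, x, mode="full")
    # 'full' mode puts zero lag at index len(x) - 1
    return int(np.argmax(np.abs(c))) - (len(x) - 1)
```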
- How on earth can you define the level of a white noise signal
by a peak value?
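(To make the point concrete, a sketch of my own: for white noise the sample peak is essentially a random outcome of the particular realisation, while the RMS level is stable, so noise levels should be stated as RMS.)

```python
import numpy as np

rng = np.random.default_rng(1)
# Two different realisations of the "same" unit-variance white noise
a = rng.standard_normal(48000)
b = rng.standard_normal(48000)

# RMS levels agree closely; peak values are just luck of the draw
rms_a, rms_b = np.sqrt(np.mean(a**2)), np.sqrt(np.mean(b**2))
peak_a, peak_b = np.max(np.abs(a)), np.max(np.abs(b))
print(rms_a, rms_b)    # both close to 1.0
print(peak_a, peak_b)  # typically 4-5, and they differ between runs
```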
- What is a square wave at 0 dB FS? Positive and negative samples
at the maximum amplitude? That does not correspond to an analog
square wave signal.
- How do you expect to measure distortion using square waves?
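(A sketch of my own to illustrate the square wave point: a "digital" square with every sample at full scale is not what a band-limited analog square wave looks like when sampled. Summing the odd harmonics that fit below Nyquist, the waveform overshoots its flat top, so a true analog square whose plateau sits at 0 dB FS would have peaks above full scale.)

```python
import numpy as np

fs, f0 = 48000, 1000.0
t = np.arange(fs) / fs

# "Digital" square at 0 dB FS: every sample at +/- full scale
naive = np.sign(np.sin(2 * np.pi * f0 * t))

# Band-limited (analog-like) square: odd harmonics below Nyquist
sq = np.zeros_like(t)
k = 1
while k * f0 < fs / 2:
    sq += np.sin(2 * np.pi * k * f0 * t) / k
    k += 2
sq *= 4 / np.pi  # scale so the flat top sits near 1.0

print(np.max(np.abs(naive)))  # 1.0
print(np.max(np.abs(sq)))     # > 1.0: Gibbs-style overshoot above the plateau
```

This also hints at why square waves make poor distortion test signals: the stimulus is already rich in high harmonics, so added harmonic distortion cannot be separated from the signal's own spectrum.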
--
FA
Lascia la spina, cogli la rosa.