On Tue, May 04, 2004 at 12:08:23PM -0700, Mark Knecht wrote:
Paul Winkler wrote:
The kind of brick-wall limiting and gain-stage-hammering that people do
to get "loud" mixes introduces significant harmonic distortion. That's
"where". I can't tell you "what"; I don't know the characteristics of
the distortion technically. As for "why"...
OK, then I'm going to 'check out' at this point. None of this thread
makes any sense to me; it's completely inconsistent with my experience,
which is mostly mixing pop/rock material. It seems to me this has
drifted very far into the Twilight Zone.
Certainly brick-wall limiters change the harmonic mixture. Whether they
'add significant harmonic distortion' should be measurable,
It's beyond measurable, it's plainly visible! Look at the samples
posted with that Rush article. They've got the kick drum practically
doing square waves.
but whether
that harmonic distortion has anything to do with being 'favorable' I'll
wait for someone with more time and interest to discover.
Of course. It's all a taste thing.
FYI, I'm not saying the hyper-loud-and-distorted trend is good.
I think a little soft clipping goes a long way, and the current
trend is far, far overboard.
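[The odd-harmonic buildup we're both describing is easy to demonstrate
numerically. A minimal sketch in plain Python -- my illustration, not a
measurement of any actual mix: one cycle of a test sine is driven
through a brick-wall (hard) clipper and through a tanh-style soft
clipper at the same gain, and harmonic levels are read off by direct
Fourier projection.]

```python
import math

N = 4096  # samples in one cycle of the test sine

def kth_harmonic(signal, k):
    """Amplitude of the k-th harmonic via direct Fourier projection."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    return 2.0 * math.sqrt(re * re + im * im) / n

def third_to_first(shaper, gain):
    """Drive one sine cycle through `shaper`; return the H3/H1 ratio."""
    out = [shaper(gain * math.sin(2 * math.pi * i / N)) for i in range(N)]
    return kth_harmonic(out, 3) / kth_harmonic(out, 1)

def hard(x):
    """Brick-wall clip at full scale."""
    return max(-1.0, min(1.0, x))

def soft(x):
    """Gentle tanh saturation -- 'a little soft clipping'."""
    return math.tanh(x)

# Same input gain, different distortion spectra: the hard clipper
# generates noticeably more third harmonic than the soft one.
ratio_hard = third_to_first(hard, 2.0)  # roughly 0.23
ratio_soft = third_to_first(soft, 2.0)  # roughly 0.16
```

[Symmetric clipping of a sine produces only odd harmonics, which is why
a heavily limited kick drum starts to look square on a waveform display.]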
Far more likely, in my mind, is that the root cause of
this is just
basic Fletcher-Munson stuff taking over at a low level and then
inexperienced listeners and musicians letting it run wild. Inexperienced
listeners often report that they like one mix over another when in fact
the two mixes are identical and the preferred one is 1 or 2 dB louder, at
least over certain listening ranges. There is absolutely no significant
harmonic anything going on when that happens.
Sure, I've heard that story many times, but it doesn't discount
the harmonic content issue. I suspect you'd have similar results if you
had people compare two otherwise identical mixes in which you only
goosed 5-10 kHz by a couple of dB.
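[For a sense of scale on those level differences -- my arithmetic, not
Mark's: the dB differences that reliably win these preference tests are
tiny as linear ratios.]

```python
def db_to_amplitude(db):
    """Linear amplitude ratio for a level change given in dB."""
    return 10.0 ** (db / 20.0)

def db_to_power(db):
    """Linear power ratio for the same level change."""
    return 10.0 ** (db / 10.0)

# The "preferred" copy in the identical-mixes story is only:
one_db = db_to_amplitude(1.0)   # about 1.12x the amplitude
two_db = db_to_amplitude(2.0)   # about 1.26x the amplitude
```

[That a ~12% amplitude bump can flip a listening preference, with no
change in harmonic content at all, is the Fletcher-Munson-style effect
Mark describes.]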
There are tricks the ear
is playing on them when they listen at low levels vs. louder levels. I'm
always surprised (well, not always anymore, but often) when I
recalibrate my environment with my SPL meter and remember how loud
85 dBA is. I tend to listen at much lower levels, but I think that most
people do not, and most engineers also mix at higher levels than I do.
With this in mind it's no surprise that a song would sound better played
at the level it was mixed.
Yeah, but I wasn't considering monitor levels during mixdown
at all. Rock music is loud when it's performed, but relatively quiet
when it's played back at home. IMO a purely "accurate" approach to
recording (insofar as such a thing is possible) doesn't really
carry the intended musical experience at low playback volume.
For one example, drum attacks sound really different.
My hypothesis is that (lack of) in-ear distortion is one factor in this
perceived difference.
If I record a
violin correctly, why do people think they need more harmonics?
Wrong choice of instrument ;-) Classical engineers tend to
lean toward neutrality and accuracy in their recordings.
Why does a violin equate to classical recording? I saw Laurie Anderson
just last evening, but of course that wasn't really a violin. ;-)
Yeah, I got sloppy there. And here I was just listening to some
bluegrass yesterday...
All I meant is, it's a pretty unusual candidate for deliberate
harmonic distortion (Velvet Underground notwithstanding).
Rock drums are a whole different story.
Just a
strange hearing thing that tends to favor having them? Possible, but
also possibly black magic. I guess I'm too 'old school'! ;-)
I suppose you use only measurement microphones too ;-)
OK, insult taken, but that's OK. I wasn't getting personal. Why are you?
Eep! Sorry Mark, I honestly didn't mean that as an insult.
Care to explain where I stepped out of bounds? I really didn't
mean to.
What I was trying to say in my clunky way is...
most pop/rock engineers use lots of different microphones with subtly
and radically different frequency & transient responses, few of them
anywhere near flat.
But it's relatively rare to use B&K measurement mics except for, well,
measurement. So it's clear that pure accuracy is not the whole ballgame;
in fact, carefully chosen INaccuracy seems to be a big part of it.
I don't think this is black magic at all, I think it's just an
evolved aesthetic. It's just like overdriving a guitar amplifier,
really... those amps weren't originally built with it in mind,
but lots of people sure like it.
Which doesn't really shed any light on why the mastering aesthetic
has now evolved to an extreme that *nobody* (afaict) really likes.
OK, I give up too :-\
--
Paul Winkler
http://www.slinkp.com