On Thu, Jul 15, 2010 at 01:14:45AM +0200, Ralf Mardorf wrote:
>> Apart from that, it remains to be seen if *real* timing errors of
>> +/- 2 ms do 'destroy the groove'. To test this, make the same
>> recording
>> - without jitter,
>> - with 1 ms jitter,
>> - with 2 ms jitter,
>> - with 3 ms jitter,
>> and check if listeners are able to identify which is which,
>> or at least to put them into order.
> I know very gifted musicians who feel the same way I do, and they
> always 'preach' that I should stop using modern computers, and I
> don't know many average people. So the listeners in my flat would
> surely be able to hear even faults that I'm unable to hear.
I'm sure they would be sensitive to bad timing. But that's not
the question. Would they be able to identify the recordings listed
above? Until you try it, you won't know, and your claim that 2 ms
of jitter 'destroys the groove' is pure conjecture.
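
To make that concrete: generating the four versions is trivial once
the material is reduced to a list of note-on times. Here is a minimal
sketch in Python; the helper name add_jitter, the uniform jitter
distribution, and the 120 BPM example material are my assumptions,
since nothing in this thread specifies them.

import random

def add_jitter(onsets, jitter_ms, seed=None):
    """Return the note-on times (in seconds) displaced by a uniform
    random offset in [-jitter_ms, +jitter_ms], clamped at zero."""
    rng = random.Random(seed)
    j = jitter_ms / 1000.0
    return [max(0.0, t + rng.uniform(-j, j)) for t in onsets]

# Four renderings of the same material, one per test condition:
onsets = [i * 0.125 for i in range(64)]   # straight 16ths at 120 BPM
versions = {ms: add_jitter(onsets, ms, seed=ms) for ms in (0, 1, 2, 3)}

Render each version through the same instrument at the same level,
shuffle the labels, and you have your blind test.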
> Anyway, this crowd shouldn't be the benchmark for good music.
> Am I wrong?
It's not about what 'good music' is. The question is whether MIDI
jitter of 2 ms degrades the quality of a rendering.
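
And if someone does run the ordering test, the result can be scored
rather than argued about. A sketch under the same assumptions (Python;
pairwise_score is a hypothetical helper, not something from this
thread): count the fraction of version pairs a listener ranks in the
correct relative order, where random guessing averages about 0.5.

from itertools import combinations

def pairwise_score(guess, truth=(0, 1, 2, 3)):
    """Fraction of version pairs the listener put in the correct
    relative order; 1.0 is a perfect ordering, ~0.5 is chance."""
    pos = {v: i for i, v in enumerate(guess)}
    pairs = list(combinations(truth, 2))
    correct = sum(pos[a] < pos[b] for a, b in pairs)
    return correct / len(pairs)

print(pairwise_score([1, 0, 2, 3]))   # one adjacent swap -> 5/6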
Ciao,
--
I want death to find me planting my cabbages, but caring little
for it, and even less for my imperfect garden.
(Michel de Montaigne)