On Tue, May 29, 2012 at 10:06:42AM +1200, Jeff McClintock wrote:
> You can't provide live MIDI playing 'in advance', you can't provide
> parameter updates in advance, just like you can't provide live audio
> in advance. If the plugin wants 'future' data to interpolate stuff,
> it needs to introduce latency. A good host will compensate for
> latency if the plugin API supports that.
Agreed. For 'live' input, all the host needs to do is replay
the control data at the same time w.r.t. the audio. Whatever
delay there is will be the same as when the user recorded
the control data, and the user will anticipate his control
input if necessary, just as he would do with note events.
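To make the 'replay at the same time w.r.t. the audio' idea concrete, here is a minimal sketch (purely illustrative names, not any real plugin API): the host stores control events with their absolute sample times and hands each audio block only the events that fall inside it, preserving the original offsets.

```python
# Hypothetical sketch: a host replays recorded control events at the
# same sample offsets, relative to the audio timeline, at which they
# were captured. Whatever latency existed while recording is then
# reproduced identically on playback.

def replay_controls(events, block_start, block_size):
    """Return (offset_in_block, value) pairs falling in this block.

    `events` is a list of (sample_time, value) tuples recorded earlier.
    """
    return [(t - block_start, v)
            for (t, v) in events
            if block_start <= t < block_start + block_size]

# usage: events recorded at absolute sample times 100 and 300;
# for the block covering samples 256..383 only the second one applies
events = [(100, 0.5), (300, 0.8)]
print(replay_controls(events, 256, 128))  # -> [(44, 0.8)]
```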
Things are different if the control data was not generated
'live' but e.g. created in a graphical automation editor
using a displayed recorded waveform - the one the plugin
will operate on - as the time reference. In that case
users will not expect an offset of control w.r.t. audio.
But otoh, even such data will probably be edited by 'trial
and error' - by listening to the result instead of trusting
the graphical input blindly - which means that even in this
case the user will somehow compensate for the delay (if it
is short enough). This is basically what I meant in my
previous reply to Dave - the delay doesn't matter if you
can keep it within reasonable bounds.
Trading audio latency for control latency as you suggest is
a possibility. But clearly a plugin should not do that if
its control input is 'live' - the latency is unavoidable
anyway in that case. Since a plugin can't be expected to
know the bigger picture, it should never do this...
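For completeness, the trade-off being discussed could look roughly like this (a sketch with made-up names, not any real plugin API): a plugin that wants 'future' control data instead delays its audio by N samples and reports N, so a latency-compensating host can line things up again. The point above stands: done on live input, this just adds avoidable delay.

```python
# Hypothetical sketch: the plugin delays audio by `lookahead` samples
# while applying control changes immediately, so relative to the
# delayed audio the control data arrives "in advance". The delay is
# reported so the host can compensate.

from collections import deque

class LookaheadPlugin:
    def __init__(self, lookahead):
        self.lookahead = lookahead            # samples of added delay
        self.buf = deque([0.0] * lookahead)   # audio delay line
        self.gain = 1.0

    def latency(self):
        return self.lookahead                 # host delay-compensates this

    def set_gain(self, value):
        self.gain = value                     # control input, applied now

    def process(self, samples):
        out = []
        for s in samples:
            self.buf.append(s)
            out.append(self.buf.popleft() * self.gain)
        return out

p = LookaheadPlugin(4)
print(p.process([1.0] * 4))  # -> [0.0, 0.0, 0.0, 0.0]  (still in the delay line)
print(p.process([1.0] * 4))  # -> [1.0, 1.0, 1.0, 1.0]
```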
Ciao,
--
FA
A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)