I assume that computing parameter trajectories
basically means
interpolating, and that inevitably introduces some latency.
That's a key point. Interpolating any sampled signal introduces latency.
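To make that concrete, here is roughly what a plugin is stuck doing when it
only gets block-rate parameter values (just a sketch, the names are made
up): it can only ramp from the previous value towards the newest one over
the block, so the newest value is about a block late by construction.

/* Rough sketch: linear ramp from the previous block-rate parameter value
 * to the newest one.  The newest value only takes full effect at the end
 * of the block, i.e. roughly one block of parameter latency. */
void ramp_parameter(float *value_per_frame, unsigned long nframes,
                    float prev_value, float new_value)
{
    for (unsigned long i = 0; i < nframes; i++) {
        float t = (float)(i + 1) / (float)nframes;  /* 0..1 across the block */
        value_per_frame[i] = prev_value + t * (new_value - prev_value);
    }
}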
Let the host pass a limited number of future parameter samples at each
run() (the amount could be negotiated at instantiation time), so that the
plugin doesn't have to add latency to the audio streams in any case. It
would only be supported by "offline hosts". If the block sizes are
variable, the future block sizes would have to be passed as well (argh?).
But I don't know if this really makes sense or has downsides... ideas,
folks?
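Roughly what I'm picturing, as a purely hypothetical sketch (this is not an
existing API, all names are invented):

/* Hypothetical sketch of the idea above -- not an existing API.  The host
 * hands run() the parameter values for the current block plus `lookahead`
 * future blocks (negotiated at instantiation), so the plugin can
 * interpolate without adding latency.  With variable block sizes the
 * future block lengths have to come along too. */
typedef struct {
    unsigned long nframes;      /* length of this (future) block           */
    const float  *param_values; /* one value per parameter for that block  */
} param_block;

void run(void *plugin_handle,
         const float *audio_in, float *audio_out, unsigned long nframes,
         const param_block *future_params, unsigned lookahead);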
I *really* hate this idea.
I play my MIDI keyboard into my DAW, perhaps while using my Mod-Wheel, or
artistically using the filter-cutoff parameter... I hit record... stop...
Then I push 'offline render'.
You would say: shift all my parameter events earlier in time and render
the result to disk? It's going to sound different. The timing will be
wrong. A DAW is like a tape recorder. Playback and offline rendering should
surely produce an identical performance?
Why are you selectively shifting some musical events in time but not
others? Why not note-ons too?
You can't provide live MIDI playing 'in advance', and you can't provide
parameter updates in advance, just like you can't provide live audio in
advance. If the plugin wants 'future' data to interpolate stuff, it needs
to introduce latency. A good host will compensate for that latency if the
plugin API supports it.
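That's the usual pattern, roughly (a sketch, all names invented): the
plugin buys whatever lookahead it wants by delaying its own audio, reports
that delay, and the host shifts the track earlier to compensate.

/* Sketch (names invented): a plugin that wants LOOKAHEAD frames of
 * lookahead just delays its audio by that much and reports the figure;
 * the host compensates on playback/render.  delay_line is assumed to be
 * zero-initialised and nframes >= LOOKAHEAD. */
#include <string.h>

#define LOOKAHEAD 64

typedef struct {
    float delay_line[LOOKAHEAD];   /* tail of the previous block */
} lookahead_plugin;

unsigned long get_latency(const lookahead_plugin *p)
{
    (void)p;
    return LOOKAHEAD;              /* what the host compensates for */
}

void process(lookahead_plugin *p, const float *in, float *out,
             unsigned long nframes)
{
    memcpy(out, p->delay_line, LOOKAHEAD * sizeof(float));
    memcpy(out + LOOKAHEAD, in, (nframes - LOOKAHEAD) * sizeof(float));
    memcpy(p->delay_line, in + nframes - LOOKAHEAD,
           LOOKAHEAD * sizeof(float));
}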
Parameters aren't special; they don't require any different handling than
MIDI. What's the difference between a MIDI controller tweaking the filter
cutoff and directly tweaking the parameter? Nothing. They both need
smoothing, they both need interpolating, they both will have latency. Don't
overcomplicate it.
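Something like this covers both cases identically (rough sketch, names
made up):

/* Sketch (names made up): one smoothing path, whether the new target came
 * from a MIDI CC or straight from a host parameter change. */
typedef struct {
    float current;   /* smoothed value actually used by the DSP   */
    float target;    /* latest requested value                    */
    float coeff;     /* per-sample smoothing factor, e.g. 0.001f  */
} smoothed_param;

void set_from_midi_cc(smoothed_param *p, unsigned cc_value)
{
    p->target = (float)cc_value / 127.0f;   /* MIDI controller path  */
}

void set_from_host(smoothed_param *p, float value)
{
    p->target = value;                      /* direct parameter path */
}

float tick(smoothed_param *p)               /* called once per sample */
{
    p->current += p->coeff * (p->target - p->current);
    return p->current;
}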
Jeff