[LAD] Plugin buffer size restrictions

David Robillard d at drobilla.net
Tue May 29 03:27:44 UTC 2012

On Tue, 2012-05-29 at 10:06 +1200, Jeff McClintock wrote:
> > I assume that computing parameter trajectories basically means
> > interpolating, and that inevitably introduces some latency.
> That's a key point. Interpolating any sampled signal introduces latency.
> > let the host pass a limited number of future parameter samples at each
> > run() (could be negotiated at instantiation time), so that the plugin
> > doesn't have to add latency to the audio streams in any case. Would be
> > only supported by "offline hosts". If the block sizes are variable,
> > future block sizes should be passed as well (argh?). But I don't know
> > if this really makes sense or has downsides... ideas, folks?
> I *really* hate this idea.
> I play my MIDI keyboard into my DAW, perhaps while using my Mod-Wheel, or
> artistically using the filter-cutoff parameter... I hit record... stop...
> Then I push 'offline render'.
>  You would say - shift all my parameter events earlier in time and render
> the result to disk?  It's going to sound different. The timing will be
> wrong.  A DAW is like a tape recorder.  Playback or offline rendering should
> result in an identical performance, surely?
>   Why are you selectively shifting some musical events in time but not
> others, why not note-ons too? 
> You can't provide live MIDI playing 'in advance', you can't provide parameter
> updates in advance, just like you can't provide live audio in advance. If
> the plugin wants 'future' data to interpolate stuff, it needs to introduce
> latency. A good host will compensate for latency if the plugin API supports
> that.
>   Parameters aren't special, they don't require any different handling than
> MIDI.  What's the difference between a MIDI controller tweaking the filter
> cutoff, or directly tweaking the parameter? Nothing. They both need
> smoothing, they both need interpolating, they both will have latency. Don't
> overcomplicate it.

Well, there is no reason to add latency to notes, whereas (many)
parameters are continuous things you want to interpolate, in which case
they act in a bit of a slewed manner.  They are different because one is
interpolated and the other is not.  In many cases a controller is coming
from somewhere with inherently sloppy timing anyway (GUI sliders,
anything through a ringbuffer).  Only the host knows this stuff, so only
the host can do the right thing.
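To make the latency point concrete, here is a minimal Python sketch (purely
illustrative, not any real plugin API) of ramping a control toward each new
target value.  The control only settles on the target a full ramp length after
the event's timestamp, which is exactly the latency being discussed; a note-on,
by contrast, simply fires at its timestamp.

```python
def render_control(n_frames, events, ramp_frames, start_value=0.0):
    """Render a per-sample control signal that linearly ramps toward
    each new target over `ramp_frames` frames.  `events` is a list of
    (frame_offset, value) pairs inside the block.  Hypothetical helper
    for illustration only."""
    targets = dict(events)
    current = start_value
    target = start_value
    step = 0.0
    out = []
    for i in range(n_frames):
        if i in targets:
            target = targets[i]
            step = (target - current) / ramp_frames
        if current != target:
            current += step
            # Clamp so we land exactly on the target, never past it.
            if (step > 0) == (current > target) and current != target:
                current = target
        out.append(current)
    return out
```

With an event at frame 0 and a 4-frame ramp, the control does not reach its
target until frame 3 - the smoothing itself is the latency.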

In fact, if controls were just numbers, and there were a global mandate
that they actually refer to the past (which is the alternative to future
values), then you would have one delayed thing - controls - while notes
still would not be delayed (MIDI events come in a sample-accurate buffer
with timestamps referring directly to audio time, which is a given since
anything else would be completely insane).  Only by keeping the plugin
API synchronous and letting the host provide future values can you get
what you want, Fons get his bandlimited interpolations, I get my sample
accurate control, and automated parameters be perfectly sample accurate.
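As a sketch of the "host provides future values" idea (a hypothetical
interface, not LV2 or any existing API): if the host hands run() the control's
value at the first frame of this block and at the first frame of the next
block, the plugin can interpolate across the block sample-accurately with no
added latency, because the "future" value comes from automation data the host
already has.

```python
def run(n_frames, ctl_start, ctl_next):
    """Hypothetical synchronous run(): ctl_start is the control's value
    at the first frame of this block, ctl_next its value at the first
    frame of the next block.  The plugin interpolates between them, so
    no buffering or delay is needed on the plugin side."""
    return [ctl_start + (ctl_next - ctl_start) * i / n_frames
            for i in range(n_frames)]
```

Only the host knows where ctl_next comes from: offline it reads ahead in the
automation data, while for a live controller it could simply pass
ctl_next == ctl_start (a hold) without the plugin changing at all.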

Let me make clear one absolute requirement: adding latency to *all*
controls/notes/etc. of plugins in all situations is not acceptable.

I do however absolutely agree that accurate reproduction of performances
is also a requirement.

These two requirements imply, as you say, that the latency of different
things (e.g. notes and controls) cannot be different.

The only practical way to achieve this is to make the host deal with it,
in whatever way is correct for the given scenario, and present the
plugin with what it needs to render when it needs it.

That is essentially the ideal I am striving for: when the plugin is told
to render frames 1024..2048, it should be provided with all the
information necessary to do so.
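One way to picture "all the information necessary": every event carries a
timestamp in audio frames, and the output for any frame range depends only on
those events - so rendering is deterministic and independent of how the
timeline is split into blocks.  A toy sketch (illustrative only, not a real
plugin interface):

```python
def render_gate(start, n_frames, events):
    """Render a 0/1 gate signal for frames [start, start + n_frames)
    from timestamped note-on/off events.  `events` is a sorted list of
    (absolute_frame, is_on) pairs.  Because each output frame depends
    only on events at or before it, any block split gives the same
    result."""
    out = []
    for f in range(start, start + n_frames):
        gate = 0
        for t, on in events:
            if t <= f:
                gate = 1 if on else 0
            else:
                break
        out.append(gate)
    return out

events = [(2, True), (6, False)]                # note-on at 2, note-off at 6
whole = render_gate(0, 8, events)               # one 8-frame block
split = render_gate(0, 4, events) + render_gate(4, 4, events)  # two blocks
```

Rendering frames 0..8 in one call or in two 4-frame calls yields identical
output, which is the block-size independence a tape-recorder-like DAW needs.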

Anything else is way, way more complicated than it needs to be.

All the questions about how to correctly do control latency and accurate
reproduction and so on are valid questions that need satisfactory
answers, but I don't think the actual plugin API should be polluted with
a ton of arcane rules about it.  It is better to simply give plugins
everything they need when they need it, and leave doing all that stuff
right to the host - particularly since there is more than one "right".

