[linux-audio-dev] +momentary, consolidated (ladspa.h.diff)

Tim Goetze tim at quitte.de
Mon Mar 8 15:18:11 UTC 2004


[Tom Szilagyi]
>Fixed latency is not a big issue because a plugin can report the maximum
>latency it could ever reach, and if the actual latency is lower it can
>artificially delay its output internally. But i think this is ugly,
>particularly because of the fact that we already have a working setup
>with the "latency" control output which allows for changing latency as
>well -- and this does make sense, since latency may depend on sampling
>rate in a large number of cases, and since sample rate may vary almost
>an order of magnitude, i think it would be suboptimal to impose a
>needless delay on the whole processing line.

ok, after going over tap_limiter.c in detail i see the point of
communicating latency information to the host at runtime (and neither
a dedicated descriptor member nor RDF will ever succeed in
accommodating this behaviour).

using a dedicated CONTROL | OUTPUT port for this purpose is indeed
a very sensible option.

consequently, all we need to do is document the "latency" port in
ladspa.h, i think.
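
to illustrate, here is a minimal sketch of what the documented
convention amounts to in plugin code: an ordinary control output port
whose name is "latency". (the port indices and names below are made
up for the example, not taken from tap_limiter.)

    #include <ladspa.h>

    enum { PORT_INPUT, PORT_OUTPUT, PORT_LATENCY, PORT_COUNT };

    static const LADSPA_PortDescriptor port_descriptors[PORT_COUNT] = {
        LADSPA_PORT_AUDIO   | LADSPA_PORT_INPUT,
        LADSPA_PORT_AUDIO   | LADSPA_PORT_OUTPUT,
        LADSPA_PORT_CONTROL | LADSPA_PORT_OUTPUT, /* host reads latency here */
    };

    static const char * const port_names[PORT_COUNT] = {
        "Input",
        "Output",
        "latency", /* the name a host would look for */
    };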

i would like to make life easier for host authors by requiring plugins
to never vary latency after instantiation (recall that instantiation
sets the sample rate, after which it remains fixed).

since the latency number is fixed throughout the plugin lifecycle, the
tap_limiter behaviour of writing to the 'latency' port once, namely as
soon as it is connected, seems the most sensible option.
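
in code, that behaviour looks roughly like this (a sketch only; the
struct and field names are invented, and 'latency_frames' is assumed
to have been computed once in instantiate() from the sample rate):

    typedef struct {
        LADSPA_Data *input;
        LADSPA_Data *output;
        LADSPA_Data *latency_port;
        unsigned long latency_frames; /* fixed at instantiation */
    } Plugin;

    static void connect_port(LADSPA_Handle handle, unsigned long port,
                             LADSPA_Data *data)
    {
        Plugin *p = (Plugin *) handle;

        switch (port) {
        case PORT_INPUT:   p->input = data;   break;
        case PORT_OUTPUT:  p->output = data;  break;
        case PORT_LATENCY:
            p->latency_port = data;
            /* report once, on connection; the figure never changes
             * for the lifetime of the instance */
            *p->latency_port = (LADSPA_Data) p->latency_frames;
            break;
        }
    }

the host can then read the value after connecting the port (before
the first run() call) and treat it as constant from then on.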

-

a much cleaner way to do this would be a 'get_latency()' method,
required to return the same figure throughout the plugin lifecycle.
we can still do that and not require the 'latency' port hack.
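
for comparison, such a method would only need one extra member in the
descriptor, something like this (purely hypothetical, not part of
ladspa.h today):

    /* hypothetical addition to LADSPA_Descriptor: returns the fixed
     * latency of the instance, in frames.  must return the same value
     * for the whole lifetime of the instance. */
    unsigned long (*get_latency)(LADSPA_Handle Instance);

a host would call it once after instantiate() and cache the result.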

what's your thought on this?

amicalement,

tim


