[LAD] Portable user interfaces for LV2 plugins.

Jeff McClintock jef at synthedit.com
Thu Mar 3 19:56:40 UTC 2011


> From: Paul Davis <paul at linuxaudiosystems.com>
> Subject: Re: [LAD] Portable user interfaces for LV2 plugins.

> VST3 allows the GUI to run in a different process?

" The design of VST 3 suggests a complete separation of processor and edit
controller by implementing two components. Splitting up an effect into these
two parts requires some extra efforts for an implementation of course. 
But this separation enables the host to run each component in a different
context. It can even run them on different computers. Another benefit is
that parameter changes can be separated when it comes to automation. While
for processing these changes need to be transmitted in a sample accurate
way, the GUI part can be updated with a much lower frequency and it can be
shifted by the amount that results from any delay compensation or other
processing offset."
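
To make that split concrete, here is a rough C++ sketch with made-up names
(the real SDK interfaces are IAudioProcessor and IEditController; this is
not the actual API). The point is that the two halves share no state, so
every parameter change travels as a plain message the host can queue,
throttle, or ship across a wire:

#include <cstdint>
#include <cstdio>

struct ParamMsg {
    uint32_t id;            // which parameter
    double   value;         // normalized 0..1
    int32_t  sampleOffset;  // sample-accurate position for the DSP
};

class AudioProcessor {      // the realtime half
    double gain_ = 1.0;
public:
    void applyChange(const ParamMsg& m) { gain_ = m.value; }
    void process(float* buf, int frames) {
        for (int i = 0; i < frames; ++i)
            buf[i] *= static_cast<float>(gain_);
    }
};

class EditController {      // the GUI half, updated at a relaxed rate
    double gain_ = 1.0;
public:
    void setFromHost(const ParamMsg& m) { gain_ = m.value; }
    double displayed() const { return gain_; }
};

int main() {
    AudioProcessor dsp;
    EditController ui;
    ParamMsg m{0, 0.5, 0};

    dsp.applyChange(m);     // delivered sample-accurately to the processor
    ui.setFromHost(m);      // delivered at UI rate to the controller

    float buf[2] = {1.0f, 1.0f};
    dsp.process(buf, 2);
    std::printf("GUI shows %.2f, DSP output %.2f\n", ui.displayed(), buf[0]);
}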

 
> > The host needs to see every parameter tweak. It needs to be between the
> > GUI and the DSP to arbitrate clashes between conflicting control
> > surfaces. It's the only way to do automation and state recall right.
> 
> well, almost. as i mentioned, AU doesn't really route parameter
> changes via the host, it just makes sure that the host can find out
> about them. the nicest part of the AU system is the highly
> configurable listener system, which can be used to set up things like
> "i need to hear about parameter changes but i don't want to be told
> more than once every 100msec" and more. It's pretty cool.
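
Something like this sketch captures that rate-limiting idea (made-up names,
not the actual AudioUnit listener API):

#include <chrono>
#include <cstdio>
#include <functional>

class ThrottledListener {
    using Clock = std::chrono::steady_clock;
    std::function<void(double)> callback_;
    Clock::duration minInterval_;
    Clock::time_point last_{};   // epoch, so the first notify always fires
public:
    ThrottledListener(std::function<void(double)> cb,
                      std::chrono::milliseconds interval)
        : callback_(std::move(cb)), minInterval_(interval) {}

    void notify(double value) {
        auto now = Clock::now();
        if (now - last_ >= minInterval_) {
            last_ = now;
            callback_(value);    // forward at most once per interval
        }
        // A real implementation would also flush the final value once
        // the interval expires, so the UI never shows a stale position.
    }
};

int main() {
    ThrottledListener ui([](double v) { std::printf("GUI sees %.2f\n", v); },
                         std::chrono::milliseconds(100));
    // A burst of rapid tweaks: only the first gets through within 100 ms.
    for (int i = 0; i <= 10; ++i)
        ui.notify(i / 10.0);
}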

Yeah. It's important to realise that at any instant three entities hold a
parameter's value:
- The audio processor part of the plugin.
- The GUI part.
- The host.

A parameter change can come from several sources:
- The GUI.
- The host's automation playback.
- A MIDI controller.
- Sometimes the audio processor (e.g. a VU meter).

If several of these are happening at once, some central entity needs to give
one priority. For example, if a parameter/knob is moving under automation
and you click that control, the automation needs to relinquish control
until you release the mouse. The host is the best place for this logic.
Think of the host as holding the parameter, and the GUI and audio processor
as 'listeners'. Or think of the host's copy of the parameter as the 'model'
and the GUI and audio processor as 'views' (the Model-View-Controller
pattern).
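
In code, that arrangement might look something like this sketch (names made
up for illustration): the host owns the value, the GUI and DSP subscribe as
listeners, and a GUI 'touch' outranks automation until the mouse is
released:

#include <cstdio>
#include <functional>
#include <vector>

class HostParam {
    double value_ = 0.0;
    bool guiTouching_ = false;
    std::vector<std::function<void(double)>> listeners_;  // GUI, DSP, ...
public:
    void addListener(std::function<void(double)> l) {
        listeners_.push_back(std::move(l));
    }
    void beginGuiEdit() { guiTouching_ = true; }   // mouse down on the knob
    void endGuiEdit()   { guiTouching_ = false; }  // mouse up

    void setFromGui(double v) { set(v); }
    void setFromAutomation(double v) {
        if (!guiTouching_)       // automation yields while the user drags
            set(v);
    }
private:
    void set(double v) {
        value_ = v;
        for (auto& l : listeners_) l(v);   // fan out to every view
    }
};

int main() {
    HostParam cutoff;
    cutoff.addListener([](double v) { std::printf("GUI redraws at %.2f\n", v); });
    cutoff.addListener([](double v) { std::printf("DSP updates at %.2f\n", v); });

    cutoff.setFromAutomation(0.3);  // applied: nobody is touching the knob
    cutoff.beginGuiEdit();
    cutoff.setFromAutomation(0.9);  // ignored: the user has the knob
    cutoff.setFromGui(0.5);         // applied
    cutoff.endGuiEdit();
}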

Best Regards,
Jeff
