[linux-audio-dev] Plugin APIs (again)

Steve Harris S.W.Harris at ecs.soton.ac.uk
Wed Dec 4 13:30:00 UTC 2002


On Wed, Dec 04, 2002 at 09:19:50AM -0800, Tim Hockin wrote:
> > In modular synth systems there are good reasons why you want to know
> > whether things are connected (and good reasons why they wouldn't be), but
> > this spec so far is not useful to a modular synth, so in this case I
> > agree.
> 
> Can you expound, if only for curiosity?  Both reasons why it's useful for
> modular synths and why this API is not?

It's very common to have modules that change their behaviour when cables are
attached, e.g. to override a knob, or to turn a knob from a value into a
level modulation.

The reason this is not useful in a modular system is (mostly) the way they
deal with polyphony.

In a modular system, each oscillator etc. is monophonic and you clone whole
blocks of the system to make it polyphonic. Also, pitch is controlled via
CV and gate, which doesn't map well to MIDI-style note events.

Sure, you could coerce the described system into a modular viewpoint, but
there would be a lot of overhead and needless complexity.

> > I'm not sure what peoples opinions on numbers of outputs are, obviously the
> > number needs to be variable per instrument at development time, but I dont
> > think it should be variable at instantiation time, that doesn't sound
> > useful, and it would be hell to optimise.
> 
> I'd agree with this, but for a few exceptions:
> * Mixer plugin.  I really do want it to have a variable number of inputs.  I
> don't want to say 48-channels or 128-channels.

Right, but do mixer plugins belong in an instrument API? Be good at one
thing...

If you can give me an example of an instrument that benefits from variable
numbers of i/o and doesn't deserve to be a standalone JACK client then I'll
agree with you.
 
> Can we assume that all voices have a standard control mapped to velocity?  Or 
> should the per-voice controls actually be another set of controls, and each
> instrument needs to specify velocity,pitchbend (and others)?

Here you could make use of well-known labels again: note-on-velocity,
note-off-velocity, pitchbend etc. The host can map these to the normal MIDI
controls if it likes.

It probably makes sense to list the per-voice controls separately from the
per-instrument ones. It's just important that they are the same at a
fundamental level (otherwise you end up with very confused developers and code).
 
> > My feeling is that just float+"arbitrary opaque binary data" is enough.
> > The float can be augmented with hints, enumerations, whatever.
> 
> String is needed if we want to deal with filenames without a custom GUI.
> Opaque binary data has no meaning to the host, and the host is what
> manipulates controls.  Where does it get that data?

The binary data comes from "GUIs". I still don't see how a generic UI can
usefully deal with filenames; it can pop up a file selector dialogue, but
the user doesn't know what they're looking for.

- Steve


