[linux-audio-dev] Plugin APIs (again)

David Olofson david at olofson.net
Wed Dec 4 14:48:01 UTC 2002


On Wednesday 04 December 2002 18.19, Tim Hockin wrote:
[...]
> > It seems like you need two sets, per instrument and per voice,
> > they should share the same interface, and the per voice ones
> > definitely should not be limited to modulation, chorus, whatever.
>
> what if each control has an int per_voice; field.  If the field is
> 0, that param is not modulatable per-voice.  If it is > 0, it is an
> instrument-unique ID which can be used by the host to modify that
> param:
> 	plug->voice_change(plug, voice_id, param_id, value);
> (or whatever it will look like if we go the way of all events).

Sounds like a good idea. The overlap between voice and channel 
controls on most synths is big enough that I can't see a proper 
motivation for two separate interfaces. It *might* make sense to have 
two variants of the events/calls, though:

	plug->channel_change(plug, param_id, value);
	plug->voice_change(plug, voice_id, param_id, value);

The alternative would be what you suggested: testing voice_id for 0, 
or -1 or whatever. (I prefer -1, as everything else starts at 0. :-)
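
Just to make the idea concrete (nothing below is from any real API; 
all names and values are made up), a control descriptor carrying Tim's 
per_voice field, plus the two call variants, could look roughly like 
this:

	#include <stdio.h>

	typedef struct plugin plugin_t;

	/* One entry in the plugin's control list. */
	typedef struct {
		const char *name;
		int         param_id;
		int         per_voice; /* 0: channel only; >0: unique ID */
	} control_desc_t;

	struct plugin {
		void (*channel_change)(plugin_t *p, int param_id,
		                       float value);
		void (*voice_change)(plugin_t *p, int voice_id,
		                     int param_id, float value);
	};

	/* Dummy implementations, just to make the sketch runnable. */
	static void my_channel_change(plugin_t *p, int param_id,
	                              float value)
	{
		(void)p;
		printf("channel: param %d = %f\n", param_id, value);
	}

	static void my_voice_change(plugin_t *p, int voice_id,
	                            int param_id, float value)
	{
		(void)p;
		printf("voice %d: param %d = %f\n",
		       voice_id, param_id, value);
	}

	int main(void)
	{
		plugin_t plug = { my_channel_change, my_voice_change };
		control_desc_t cutoff = { "cutoff", 3, 1 };

		/* Whole channel, then just one voice: */
		plug.channel_change(&plug, cutoff.param_id, 0.5f);
		plug.voice_change(&plug, 7, cutoff.param_id, 0.8f);
		return 0;
	}

(With the single-call alternative, voice_id == -1 would simply mean 
"channel scope" instead.)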


> Can we assume that all voices have a standard control mapped to
> velocity?

No. Organs don't have velocity, for example, so why should virtual 
organs implement it, or even get the information? (Ok, I know that 
most synths implement velocity for organ sounds anyway...)


> Or should the per-voice controls actually be another set
> of controls, and each instrument needs to specify
> velocity,pitchbend (and others)?

Well... In the case of Audiality, you can assume *nothing* about a 
patch/instrument. A channel is routed right into a patch driver, 
which can be a standard mono or poly synth, a MIDI file player, a 
script parser/VM or whatever. If you load the patch "CoffeeBrewer", 
it may *really* be an interface to your coffee brewing machine, 
rather than something that drives the synth engine.

So, I would say it can be a bit hard for some synths to report a 
useful list of controls. On most synths, it all depends on the loaded 
patch/instrument. There may be a fixed set of controls, but that 
doesn't mean all patches/instruments make use of all of them. I would 
really prefer it if the API enabled the host to display a list of 
controls that are *actually* supported, not just a list of what 
*might* be supported.

This is one aspect of MIDI that sucks. There are 128 controls, but 
most synths only support a fraction of them - and then most patches 
only support a fraction of *those* controls. As if that's not bad 
enough, most synths and patches aren't properly documented, so you 
simply have to test all controls, or dive into the programming mode 
of the synth, to find out what is supported, and how it works. 

Wouldn't it be great if every patch/instrument could report the 
controls it has, and perhaps even a short description of each one of 
them?
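
Something along these lines, perhaps (all names invented; this is 
just a sketch of the idea, not a proposal for actual struct layouts):

	#include <stdio.h>

	typedef struct {
		int         param_id;
		const char *name;
		const char *description;
	} patch_control_info_t;

	typedef struct {
		const char                 *patch_name;
		int                         num_controls;
		const patch_control_info_t *controls;
	} patch_info_t;

	/* Example: a virtual organ patch simply never lists velocity. */
	static const patch_control_info_t organ_controls[] = {
		{ 1, "drawbar_16", "Level of the 16' drawbar" },
		{ 2, "drawbar_8",  "Level of the 8' drawbar"  },
		{ 3, "vibrato",    "Scanner vibrato depth"    },
	};

	static const patch_info_t organ_patch = {
		"CheesyOrgan", 3, organ_controls
	};

	int main(void)
	{
		int i;
		printf("Controls of patch \"%s\":\n",
		       organ_patch.patch_name);
		for (i = 0; i < organ_patch.num_controls; i++)
			printf("  %s - %s\n",
			       organ_patch.controls[i].name,
			       organ_patch.controls[i].description);
		return 0;
	}

The point being that the list comes from the *loaded patch*, not from 
some fixed table in the synth.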


> As a similar: should instrument plugins be providing hints as to
> their MIDI-equivalent functionality?  CC-equivalent numbers per
> control, perhaps?

I'd prefer that we don't even mention "MIDI" in the API - though the 
terminology and the controls defined there might be a good starting 
point.

Either way, I think we need more than one enumeration, or at least, 
the enumeration needs to be "partitioned" in some way, so one can 
tell whether two controls with different type hints are approximately 
equivalent. For example, one might be interested in knowing whether a 
control affects level, frequency, modulation depth or "effect depth". 
One might also want to know whether we're talking about volume, 
frequency, cutoff or whatever. Further, whether the mapping is 
linear, quadratic, cubic, logarithmic etc. could be of interest.
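
As a rough illustration (invented names again), the hints could be 
split into a few independent enumerations rather than one flat list:

	#include <stdio.h>

	typedef enum {	/* What the control affects */
		HINT_CLASS_LEVEL,
		HINT_CLASS_FREQUENCY,
		HINT_CLASS_MOD_DEPTH,
		HINT_CLASS_FX_DEPTH
	} hint_class_t;

	typedef enum {	/* What the control actually "is" */
		HINT_KIND_VOLUME,
		HINT_KIND_PITCH,
		HINT_KIND_CUTOFF,
		HINT_KIND_OTHER
	} hint_kind_t;

	typedef enum {	/* How the value maps to the effect */
		HINT_MAP_LINEAR,
		HINT_MAP_QUADRATIC,
		HINT_MAP_CUBIC,
		HINT_MAP_LOGARITHMIC
	} hint_map_t;

	typedef struct {
		hint_class_t class_hint;
		hint_kind_t  kind_hint;
		hint_map_t   map_hint;
	} control_hints_t;

	int main(void)
	{
		/* A filter cutoff control, for example: */
		control_hints_t cutoff = {
			HINT_CLASS_FREQUENCY,
			HINT_KIND_CUTOFF,
			HINT_MAP_LOGARITHMIC
		};
		printf("cutoff hints: %d/%d/%d\n",
		       cutoff.class_hint, cutoff.kind_hint,
		       cutoff.map_hint);
		return 0;
	}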

That said, we're still talking about *hints*. It's not as if you 
won't be able to use a plugin just because the host doesn't 
understand what the controls are all about. The important thing is 
that the *user* understands what the controls are for - and a 
sensible name is usually sufficient for that.


> > My feeling is that just float+"arbitrary opaque binary data" is
> > enough. The float can be augmented with hints, enumerations,
> > whatever.
>
> String is needed if we want to deal with filenames without a custom
> GUI.

Just to make a general point about worrying about host-constructed 
GUIs:

You need various special GUI widgets to handle things like envelopes, 
graphical oscillator waveform editors, spectrum editors and other 
cool stuff. In some cases, you also need special control types to 
deal with these in a sane way. Are we supposed to put all of that 
into the API and every host, or are we going to lock that type of 
plugin out entirely?

My point is just that a line has to be drawn somewhere. IMO, it 
should be drawn so that communication the host can actually benefit 
from understanding goes through standardized protocols, while the 
rest goes through "raw data blocks" and similar generic solutions.

In this case, it *is* useful for the host to know about external 
files that plugins reference, since this information can serve as a 
complete list of all data that is referenced by a project.

In general, it is *not* useful for the host to understand data that 
it isn't expected to be able to actually *manipulate*. IMHO, there 
should be as little as possible of this in plugins, but I can't see 
any way of eliminating it.

And mind you, I really tried. That's basically what the MAIA project 
was about. The project is effectively dead now, although many of the 
ideas live on in Audiality. (The basic event system is almost 
identical, for example.)

That said, I actually came up with a rather simple solution, 
considering the magnitude of the problem:

	* Put unique IDs on the data types, so hosts can know which
	  ports are compatible without understanding the data, or
	  even recognising the data type IDs.

	* Plugin coders should be encouraged to first look for
	  built-in standard data types that fit the bill, and only
	  invent new types when they're really required.

	* After inventing a new data type, you're encouraged to
	  implement a set of "mini plugins" (a toolkit, if you will)
	  to perform basic operations on the new data type. This
	  would include editors, displays, mixers and that sort of
	  thing. These "mini plugins" should be registered under
	  special names or IDs, so hosts can recognize them as
	  "compatibility plugins" when they're faced with their
	  respective custom data type.
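
The type ID part of that really is as trivial as it sounds; a host 
could get by with a string compare (all IDs below are invented, of 
course):

	#include <stdio.h>
	#include <string.h>

	typedef struct {
		const char *type_id;   /* unique custom data type ID */
		const char *editor_id; /* registered "mini plugin" editor */
	} custom_type_t;

	/* Two ports connect iff their type IDs match exactly; the host
	 * never has to understand the data itself. */
	static int ports_compatible(const custom_type_t *a,
	                            const custom_type_t *b)
	{
		return strcmp(a->type_id, b->type_id) == 0;
	}

	int main(void)
	{
		custom_type_t wavetable = {
			"org.example.wavetable.v1",
			"org.example.wavetable.v1.editor"
		};
		custom_type_t envelope = {
			"org.example.envelope.v1",
			"org.example.envelope.v1.editor"
		};

		printf("wavetable <-> wavetable: %d\n",
		       ports_compatible(&wavetable, &wavetable)); /* 1 */
		printf("wavetable <-> envelope:  %d\n",
		       ports_compatible(&wavetable, &envelope));  /* 0 */
		return 0;
	}

If the host also finds a registered editor for the type, it can offer 
real editing; if not, it just treats the data as an opaque blob.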


We could support a system like this for the "raw data blocks". Host 
and plugin authors may support it or not; the only difference is that 
when there is support in the host, and the plugin author cared to 
implement the "compatibility mini plugins", these mystical ports may 
actually be manipulated by the host to some extent. (If not, the 
standard procedures for preset saving and loading apply, and that's 
about it.)


> Opaque binary data has no meaning to the host, and the host is
> what manipulates controls.  Where does it get that data?

I don't totally agree with the idea that it is the host that 
manipulates controls. I would agree that having *only* controls, and 
no mystical private patch data, would be cool - but I don't think 
restricting the API to supporting only known control types is a good 
idea. Some sort of "raw data blocks" will be needed by some plugins.

Whether or not this should be implemented as an actual raw data block 
type, or as external files (loaded by the host or the plugin?) is 
another matter.

I'm actually leaning more and more towards external files, as it's a 
simple and flexible solution that allows easy sharing of data across 
projects, use of standard external editors and other stuff - and the 
host may still be nice and actually *understand* where these files 
belong, and what to do with them when saving projects for backup or 
transfer. (See one of my previous posts.)
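
For example (invented names and paths, just to show the idea), a 
simple "filename" hint on a string control is all the host needs in 
order to build that list of referenced files:

	#include <stdio.h>

	#define CONTROL_HINT_FILENAME 0x0100 /* value is a file path */

	typedef struct {
		const char *name;
		int         hints;
		const char *path; /* current value, if a filename control */
	} file_control_t;

	int main(void)
	{
		/* Controls exposed by some imaginary sampler plugin. */
		file_control_t controls[] = {
			{ "sample",  CONTROL_HINT_FILENAME,
			  "/home/me/samples/kick.wav" },
			{ "impulse", CONTROL_HINT_FILENAME,
			  "/home/me/ir/hall.wav" },
		};
		int i, n = (int)(sizeof(controls) / sizeof(controls[0]));

		/* The host walks the controls and lists every external
		 * file the project depends on. */
		for (i = 0; i < n; i++)
			if (controls[i].hints & CONTROL_HINT_FILENAME)
				printf("project references: %s\n",
				       controls[i].path);
		return 0;
	}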


//David Olofson - Programmer, Composer, Open Source Advocate

.- Coming soon from VaporWare Inc...------------------------.
| The Return of Audiality! Real, working software. Really!  |
| Real time and off-line synthesis, scripting, MIDI, LGPL...|
`-----------------------------------> (Public Release RSN) -'
.- M A I A -------------------------------------------------.
|    The Multimedia Application Integration Architecture    |
`----------------------------> http://www.linuxdj.com/maia -'
   --- http://olofson.net --- http://www.reologica.se ---


