> I think you are in error considering these things mutually exclusive.
> Yes, hosts dealing with MIDI binding is how things should be done, but
> crippling a plugin API so it cannot handle MIDI is just that:
> crippling. Maybe I want to patch up a bunch of plugins to process MIDI
> events, or have some MIDI effect plugins: these are certainly
> reasonable things to do.
Hi dr,
I think we miscommunicated. MIDI is *fully* supported, including SYSEX,
delivered to plugins as raw, unmolested bytes. Plugins can and do function as
MIDI processors.
The 'Guitar de-channelizer' is supplied as an example MIDI processor with
the SDK, as is the 'MIDI to Gate' plugin.
The idea of binding MIDI to the plugin's parameters is a purely optional
alternative.
> LV2 UIs are also like this, though there is an extension to provide a
> pointer to the plugin instance to the UI. In theory this should only be
> used for displaying waveforms and such, and always be optional.
The way I display waveforms: the API has a function sendMessageToGui() that
sends an arbitrary bunch of bytes to the GUI in a thread-safe manner. You
can build on that to send waveforms etc. Neither the DSP nor the GUI needs a
pointer to the other (but they can have one if they *really* want to).
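To make the idea concrete, here is a minimal sketch of that pattern. The names (GuiMessageQueue, pollMessage) are my own illustration, not the actual GMPI API: the DSP side posts raw bytes into a lock-protected queue, and the GUI side drains it on its own thread.

```cpp
#include <cassert>
#include <cstdint>
#include <mutex>
#include <queue>
#include <vector>

// Hypothetical illustration of a sendMessageToGui()-style channel:
// the DSP side posts raw bytes, the GUI side polls for them later.
// Neither side ever holds a pointer to the other.
class GuiMessageQueue {
public:
    // Called from the DSP side: copy the bytes under a lock.
    void sendMessageToGui(const uint8_t* data, size_t size) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.emplace(data, data + size);
    }

    // Called from the GUI side: pop one pending message, if any.
    bool pollMessage(std::vector<uint8_t>& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (queue_.empty())
            return false;
        out = std::move(queue_.front());
        queue_.pop();
        return true;
    }

private:
    std::mutex mutex_;
    std::queue<std::vector<uint8_t>> queue_;
};
```

A chunk of waveform samples is just one more byte payload here. A production audio thread would normally use a wait-free FIFO rather than a mutex, but the point stands: the transport is opaque bytes, so no shared pointer is required.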
> Your argument sounds very obviously right because it's about numeric
> parameters, but note and voice control is trickier. That involves
> inventing a new, better event format.
I will disagree and say MIDI note and voice control is pretty good,
*provided* you support MIDI real-time tuning changes (an existing MIDI
SYSEX command that can tune any note to any fractional pitch in real time,
AKA micro-tuning) ...and... support "Key-Based Instrument Control"
(another little-known MIDI command that provides 128 per-note controllers).
By supporting these two MIDI commands you get the familiarity of MIDI with
the addition of:
* Fractional Pitch.
* Per note controllers.
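For readers unfamiliar with the first of those messages, here is a sketch of building a real-time single-note tuning change, based on my reading of the MIDI Tuning Standard (the helper name and clamping choice are mine). The fractional pitch is a 14-bit value in units of 1/16384 of a semitone, split across two 7-bit data bytes.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Build a MIDI real-time single-note tuning change (MIDI Tuning Standard):
// retune one key to an integer semitone plus a 14-bit semitone fraction.
std::vector<uint8_t> makeTuningChange(uint8_t deviceId, uint8_t tuningProgram,
                                      uint8_t key, double targetNote) {
    uint8_t semitone = static_cast<uint8_t>(targetNote);  // integer part
    uint32_t frac =
        static_cast<uint32_t>((targetNote - semitone) * 16384.0 + 0.5);
    if (frac > 0x3FFF)  // keep the fraction within its 14-bit field
        frac = 0x3FFF;
    return {
        0xF0, 0x7F, deviceId,
        0x08, 0x02,       // sub-IDs: MIDI tuning, real-time note change
        tuningProgram,
        0x01,             // number of notes being retuned
        key,
        semitone,
        static_cast<uint8_t>((frac >> 7) & 0x7F),  // fraction MSB
        static_cast<uint8_t>(frac & 0x7F),         // fraction LSB
        0xF7
    };
}
```

So retuning key 60 up a quarter-tone is one short SYSEX message, applicable in real time, no new event format required.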
By binding MIDI to the plugin's parameters as 32-bit 'float' via metadata,
you remove the need to support MIDI explicitly, you kill the dependency on
MIDI's 7-bit resolution, and you remain open to extending the API in the
future to support OSC.
GMPI parameter events include an optional 'voice-number', which extends the
MIDI-binding system to note events and polyphonic aftertouch. I can build an
entirely MIDI-free synthesiser, yet the metadata bindings make it fully
MIDI-compatible.
> This is precisely the kind of reason why monolithic non-extensible
> specifications suck.
GMPI is extensible too. For example, MS-WINDOWS GUIs are provided as an
extension (so the core spec can be adapted to other platforms), as is
support for some SynthEdit-specific features that don't really belong in
the core spec.
Best Regards,
Jeff