On Fri, 2012-08-03 at 14:18 +1200, Jeff McClintock wrote:
> For historical interest, I did complete the GMPI prototype.
I don't suppose the code for those modular synthesis plugins is available? :)
> I release as many as possible open source. Unfortunately, before I used
> plugins I coded everything as part of my application, so most of the good
> filters, oscillators, etc. are not available as plugins; when I have time
> I'll do more. The stuff I have released is included with the SDK:
> http://www.synthedit.com/software-development-kit/
> It's pretty specific to SynthEdit and the SEM SDK though, so might not be
> much help to anyone.
Cool, thanks.
> The SEM DSP Plugin API has a total of four functions { open(),
> setBuffer(), process(), receiveMessageFromGui() }; the remainder of the
> spec is covered by metadata.
Did you consciously decide to use a setBuffer method for some reason?
I consider that (connect_port in LADSPA/LV2) a mistake and an efficiency
problem. Passing an array to process() is better:
[Plugin]--buffer-->[plugin]
> I use the same buffers over and over, so I inform the plugin of the
> buffer addresses once and then need not do so again. Not passing an array
> to process() means less overhead on the function call... But I think any
> efficiency gain would be minimal either way; passing an array to
> process() is probably just as good.
I guess you probably don't have that many ports, unlike LV2, where there
are often LOTS of ports due to parameters. That is a lot of function
calls.
It also annoys me because it forces plugins to have state they don't
really need. Buffers are only valid in process(). I have quite a few
very simple plugins that have state exclusively to keep these buffer
pointers around; otherwise they could be purely functional (and the host
could exploit that fact in convenient ways, especially when threads get
involved).
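The difference can be sketched in C. This is just an illustration with invented names (it is not code from any real SDK): style A is connect_port/setBuffer-like and needs instance state, style B receives the buffers with the call and can stay purely functional.

```c
#include <assert.h>
#include <stddef.h>

/* Style A: buffer pointers are set once up front, so the instance
 * must carry state whose only purpose is remembering them. */
typedef struct {
    const float *in;
    float       *out;
} GainA;

void gain_a_connect(GainA *p, const float *in, float *out) {
    p->in  = in;   /* state that exists only to hold buffer addresses */
    p->out = out;
}

void gain_a_process(GainA *p, size_t n, float gain) {
    for (size_t i = 0; i < n; ++i)
        p->out[i] = p->in[i] * gain;
}

/* Style B: buffers arrive with the call; no instance needed at all
 * for a plugin this simple. */
void gain_b_process(const float *in, float *out, size_t n, float gain) {
    for (size_t i = 0; i < n; ++i)
        out[i] = in[i] * gain;
}
```

Both produce identical output; the point is only where the buffer pointers live.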
A requirement specifically about a strong and sensible plugin/UI
separation would have been a good one. That was by far the worst thing
about porting VSTs with GUIs. A free-for-all for UIs to screw with DSP
code is insane.
> Absolutely. The GMPI model is complete separation of GUI/DSP. The GUI
> has no direct access to the DSP. When the GUI tweaks a parameter it does
> so only via the Host. The GUI can run on your iPad, the DSP on your
> Linux box. No special plugin code for that scenario; it's transparent.
> Very clean.
LV2 UIs are also like this, though there is an extension to provide a
pointer to the plugin instance to the UI.
In theory this should only be used for displaying waveforms and such,
and always be optional. In practice it is used by people porting shitty
VSTs, and unfortunately abused by people to do communication and all
kinds of awful things. I hate it with a passion, but c'est la vie.
With a powerful enough communication mechanism now established for
higher level control, hopefully at least the abuse will go away.
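The shape of that separation can be sketched in a few lines. This is a toy sketch with invented names (real hosts would route through a lock-free queue or the network rather than a direct call): the GUI never holds a pointer to the DSP instance, so the two halves can live in different threads, processes, or machines.

```c
#include <assert.h>
#include <stdint.h>

/* A parameter-change message: the only thing GUI and DSP share. */
typedef struct { uint32_t param; float value; } ParamMsg;

/* DSP-side instance; the GUI never sees this type's address. */
typedef struct { float params[8]; } Dsp;

/* GUI side: describe the tweak as data. No Dsp* in sight. */
ParamMsg gui_tweak(uint32_t param, float value) {
    ParamMsg m = { param, value };
    return m;
}

/* Host side: the sole code that touches the DSP instance. */
void host_deliver(Dsp *dsp, const ParamMsg *m) {
    if (m->param < 8)
        dsp->params[m->param] = m->value;
}
```

Because all GUI-to-DSP traffic is plain messages, transporting them across a socket instead of a function call changes nothing in the plugin.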
Fair enough, I guess I will just list this as met. I felt compelled to
at least hint that LV2 is not the sort of project that would ever
consider including evil software crippling garbage as a requirement ;)
> Understood. GMPI was for both commercial and free software.
As is LV2.
>> "Including" MIDI is not a burden; it is a trivial consequence of
>> having a sane event mechanism...
[...]
> Yeah, I do support MIDI via 'events'; MIDI was trivial... but...
> My concept with GMPI (not everyone agreed) was that MIDI was not
> required *in* the plugin.
> For example, take your MIDI keyboard's "Modulation Wheel". Imagine the
> function that parses the MIDI bytes, decides what type of MIDI message
> it is, and typically converts that 7-bit controller to a nice 'float'
> scaled between -1.0 and +1.0. Typically every single VST synth plugin
> has that code. The inefficiency is that 1000 plugin developers had to
> create and debug equivalent code.
> I decided the *host* should provide that routine. Written once,
> available to every plugin developer. The plugin simply exposes a port,
> and the metadata says "map this port to MIDI modulation-wheel".
> The advantage is that very rich MIDI support is available to ALL
> plugins. Not just the standard 7-bit controllers: I support CC, RPN,
> NRPN, SysEx, notes, aftertouch, tempo, almost everything. The MIDI
> parsing code exists in only one place so it's robust, and it saves
> plugin developers a lot of wasted time duplicating existing code.
> Plugins are small and lightweight, but MORE functional than the average
> VST plugin.
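The host-side routine Jeff describes might look roughly like this. The function name is invented and the -1.0..+1.0 scaling follows the mod-wheel example above; a real host would of course cover the whole message set, not just one controller.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical host-side MIDI mapping (invented for illustration):
 * parse a 3-byte MIDI message and, when it is CC#1 (the mod wheel),
 * produce a float scaled to -1.0..+1.0 for the bound plugin port.
 * Written once in the host, shared by every plugin.
 * Returns 1 and writes *out on a match, 0 otherwise. */
int host_map_modwheel(const uint8_t msg[3], float *out) {
    if ((msg[0] & 0xF0) != 0xB0)  /* status: control change, any channel */
        return 0;
    if (msg[1] != 1)              /* controller number 1 = mod wheel */
        return 0;
    *out = (msg[2] / 127.0f) * 2.0f - 1.0f;  /* 7-bit 0..127 -> -1..+1 */
    return 1;
}
```

The plugin itself never sees the raw bytes; it just reads a float from a port.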
This is a good idea, but making it impossible for plugins to process
MIDI is still a bad one.
Plugins should not have mystery parameters, and most hosts would support
this kind of binding anyway. Supporting binding in every plugin is
indeed insane.
> With VST it's a mess. Developers' MIDI support is highly variable;
> almost none support SysEx, for example. Half of them can't handle NRPN
> properly, if at all.
> The other advantage is that with MIDI abstracted and presented as
> normalized 'float' values, the host can substitute OSC or HD-MIDI
> control without changing the API. Your existing plugins become
> 'hi-definition' and 'OSC capable' without any effort. You don't have to
> argue "MIDI vs OSC" because both are possible.
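The substitution point can be sketched in a few lines (invented helper names, and a 0..1 normalized range assumed purely for illustration): the plugin port sees the same float regardless of the controller's wire format.

```c
#include <assert.h>

/* Hypothetical normalization helpers: three wire formats, one port
 * value. Because the plugin only ever sees the normalized float,
 * swapping the source from 7-bit MIDI to 14-bit MIDI or OSC changes
 * nothing in the plugin itself. */
float from_midi7(unsigned v)  { return (float)v / 127.0f;   }  /* 7-bit CC       */
float from_midi14(unsigned v) { return (float)v / 16383.0f; }  /* 14-bit RPN/NRPN */
float from_osc(float v)       { return v;                   }  /* already a float */
```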
> Back then I said 'no MIDI in GMPI'. It was interpreted as highly
> radical and everyone hated the idea. I should have said 'MIDI parsing
> provided as a service to the plugin'. You have to write the MIDI
> parsing anyhow; why not make it available for everyone?
I think you are wrong to consider these things mutually exclusive.
Yes, hosts dealing with MIDI binding is how things should be done, but
crippling a plugin API so it cannot handle MIDI is just that:
crippling. Maybe I want to patch up a bunch of plugins to process MIDI
events, or have some MIDI effect plugins: these are certainly reasonable
things to do.
Making generic events is so simple that not doing it is just silly.
There is no point in arguing about whether GMPI or LV2 should "support"
MIDI, because if the spec mandates the blessed types of events you are
allowed to use, it has already failed. Just as MIDI does not address
the needs of everyone today, whatever anyone comes up with to replace
it will not either.
Your argument sounds very obviously right because it's about numeric
parameters, but note and voice control is trickier. That involves
inventing a new, better event format. That's not making the choice of
MIDI or OSC go away; that's *adding yet another choice*. That's fine,
but it is without a doubt wiser to do so within a generic event system
where all of these can be used, or new types added in the future, rather
than to expect you thought of everything anyone will ever need this
time around.
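A generic event system of that sort can be sketched briefly. This is loosely in the spirit of LV2 atoms, but every detail here is invented for illustration: a timestamp, a type id the host mapped from a URI, and an opaque body. MIDI is then just one type among many, and new kinds of events need no API change.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* An extensible event: the header is fixed, the body is opaque. */
typedef struct {
    uint32_t frames;   /* time offset within the audio block */
    uint32_t type;     /* host-assigned type id, mapped from a URI */
    uint32_t size;     /* valid bytes in body */
    uint8_t  body[16]; /* fixed-size body, enough for this sketch */
} Event;

enum { TYPE_MIDI = 1, TYPE_FLOAT = 2 };  /* ids a host might hand out */

/* A plugin dispatches on type and skips anything it doesn't know, so
 * existing plugins keep working when new event types appear. */
float sum_float_events(const Event *evs, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; ++i) {
        if (evs[i].type == TYPE_FLOAT && evs[i].size == sizeof(float)) {
            float v;
            memcpy(&v, evs[i].body, sizeof v);
            acc += v;
        }
    }
    return acc;
}
```

A MIDI event in the same stream is simply three raw bytes tagged TYPE_MIDI; a future "HD" note event would just be another type id.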
Providing good powerful solutions for this stuff is good. Painting
yourself into a corner and misusing authority to make overbearing specs
is not. As a developer I prefer my reasons for not doing something to
be something more sound than "some person on a committee somewhere
decided you are not allowed to" ;)
That said, sure, MIDI sucks. Something better is needed, but certainly
not some One Ultimate Control Structure to the exclusion of everything
else.
> All I've done with GMPI is recognise that MIDI controllers can be
> 'normalised' and presented to the plugin as nice hi-definition floats
> using the event system (atoms?). The plugin 'has no MIDI', no crufty
> 7-bit crap, yet is fully compatible with MIDI.
Forbidding plugins from talking MIDI is 100% loss, period. Everything
you've said may be brilliant, and MIDI may be terrible, but it's still
just crippling.
Like it or not, MIDI exists, and people want or need to process it. As
a fellow modular person I am somewhat surprised at your stance on this.
Why should a plugin spec make this useful task impossible? A good
plugin API should make possible everything that is sensible to do with
plugins. Processing MIDI is certainly a sensible thing to do with
plugins. This is precisely the kind of reason why monolithic
non-extensible specifications suck.
Cheers,
-dr