On Friday 30 November 2007, Dave Robillard wrote:
> > That's why I'm using a Port as the smallest "connection unit",
> > much like LADSPA ports, so there is no need for an event type
> > field of any kind at all, let alone a URI.
>
> Ports /are/ the smallest "connection unit". But ports can /contain/
> events, and if we want multiple kinds of events in a single port,
> then the events themselves need a type field.

Sounds like there's a fundamental difference there, then. I'm using
a model where a port is nothing more than something that deals with
a value of some sort. There are no channels, voices, different
events or anything "inside" a port - just a value. An output port
can "operate" the value of a compatible input port.

Of course, that "value" could be anything, but I'm not explicitly
supporting that on the API level. If plugins want to use MIDI
messages, they're on their own when it comes to mapping of channels,
CCs etc. That stuff is really beyond the scope of my project, as one
wouldn't be able to configure and control such things normally.
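To make that concrete, here is roughly what a port and a connection
boil down to in this model. This is only an illustrative sketch with
made-up names - not actual Audiality 2 code:

    /* Illustrative sketch only - not actual Audiality 2 code. */

    /* A port deals with exactly one value; there are no channels,
     * voices or events "inside" it.
     */
    typedef struct A2_PORT
    {
        float value;            /* the one value this port carries */
    } A2_PORT;

    /* A connection just lets an output port "operate" the value of
     * a compatible input port.
     */
    typedef struct A2_CONNECTION
    {
        A2_PORT *output;        /* writer */
        A2_PORT *input;         /* reader */
    } A2_CONNECTION;

    /* Run by the host once per buffer, in graph order. */
    static void a2_connection_process(A2_CONNECTION *c)
    {
        c->input->value = c->output->value;
    }

In practice the value would usually be driven by timestamped events
rather than a plain per-buffer float (more on that below), but the
principle is the same: one port, one value.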
> > The data in the events *could* be MIDI or whatever (the host
> > doesn't even have to understand any of it), but normally, in the
> > case of Audiality 2, it'll be modular synth style ramped control
> > events. That is, one port controls exactly one value - just like
> > in LADSPA, only using timestamped events with ramping info
> > instead of one value per buffer.
>
> The host might not have to (though in practice it usually does),
> but other plugins certainly do. You can't process events if you
> don't even know what they are.

Yes, obviously. I don't quite see what you think I'm trying to say
here. :-)
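To illustrate what I mean by ramped control events in the quoted
text above: each event just tells the receiver to take the port's
single value to some target over some number of sample frames.
Something along these lines - a sketch with made-up names, not
actual Audiality 2 code:

    /* Sketch only; field names are made up for illustration. */
    typedef struct A2_RAMP_EVENT
    {
        unsigned when;          /* timestamp; (fixed point) sample
                                 * frame within the current buffer */
        float target;           /* value the port should reach */
        unsigned duration;      /* ramp length in sample frames;
                                 * 0 means "set immediately" */
    } A2_RAMP_EVENT;

A stream of these per port replaces LADSPA's one-value-per-buffer
control inputs; between events, the plugin just keeps ramping, or
holds the last value.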
> > Extensibility is a non-issue on this level.
>
> OK, the event extension doesn't define your ramped control events,
> so you're not allowed to use them, ever, period.
>
> ...looks like extensibility is an issue at this level, eh? ;)

Right, but that's mostly about Audiality 2 anyway. There, if I had
for some reason started with control events without ramping, I'd
just add another "control events v2" port type. Whether that type
happens to be a superset of the first one doesn't really matter, as
the two still wouldn't be compatible.

Where it makes sense, one can provide converters to/from other
types, but to the host (that is, the low level machinery directly
dealing with plugin graphs), those are just ordinary plugins with
only one input port and one output port.
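As to converters being ordinary plugins: to the graph machinery, a
"control events v1" to "control events v2" converter is just a
plugin exposing one input port and one output port of the respective
types. A rough sketch - the descriptor layout and URIs here are made
up, purely for illustration:

    /* Made-up descriptor layout, for illustration only. */
    typedef struct A2_PORTDESC
    {
        const char *type_uri;   /* port type identifier */
        int is_output;          /* 0 = input, 1 = output */
    } A2_PORTDESC;

    static const A2_PORTDESC converter_ports[] = {
        { "urn:example:control-events-v1", 0 },  /* input */
        { "urn:example:control-events-v2", 1 }   /* output */
    };

    /* The converter's process callback would read v1 events from
     * the input port and write corresponding v2 events to the
     * output port; the host never needs to know that any conversion
     * is going on.
     */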
> > What you do if you want more stuff is just grab another URI for
> > a new event based protocol, and you get to start over with a
> > fresh event struct to use in whatever way you like. (In fact, as
> > it is, the host doesn't even have to know you'll be using
> > events. It just provides a LIFO pool of events for any plugins
> > that might need it.)
>
> Sounds like you're thinking too hard.

Nah. I'm just in the middle of another project, and the Audiality 2
code isn't in a state where I could post that without just adding to
the confusion. And I think we might have a terminology impedance
mismatch. :-)
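To clarify the "LIFO pool of events" from the quoted text: the host
preallocates a bunch of small event blocks and keeps the unused ones
in a free list, so grabbing or returning an event in the audio
thread is just a pointer swap - no malloc(). A rough sketch, again
with made-up names rather than actual Audiality 2 code:

    /* Sketch only - not actual Audiality 2 code. */
    typedef struct A2_EVENT A2_EVENT;
    struct A2_EVENT
    {
        A2_EVENT *next;         /* link; free list or port queue */
        unsigned timestamp;     /* fixed point: (frame << 8) | fraction */
        unsigned char data[16]; /* raw payload, e.g. a ramped control
                                 * event like the one sketched above */
    };

    typedef struct A2_EVENTPOOL
    {
        A2_EVENT *free_list;    /* LIFO of unused events */
    } A2_EVENTPOOL;

    static A2_EVENT *a2_event_alloc(A2_EVENTPOOL *p)
    {
        A2_EVENT *e = p->free_list;
        if(e)
            p->free_list = e->next;
        return e;               /* NULL if the pool is exhausted */
    }

    static void a2_event_free(A2_EVENTPOOL *p, A2_EVENT *e)
    {
        e->next = p->free_list;
        p->free_list = e;
    }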
"Events" here are just a bunch of bytes in a
flat buffer.
Mine are implemented as linked lists of small memory blocks, for
various reasons. (I've had a working implementation for years, so
I'll stick with that for now. Not saying it's the best or most
efficient way of doing it, but I have yet to figure out how to bend
flat buffers around my event routing model - or the other way
around.)
I did "hardwire" fixed point timestamps as those are closely related
to the whole deal with sample frames, buffers etc - but the data area
is indeed just a bunch of raw bytes.
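To show why the timestamps are so closely tied to sample frames and
buffers: a plugin's process callback basically walks the event list
for each input port and splits the buffer at event boundaries.
Roughly like this - a sketch reusing the made-up A2_EVENT from
above, where the upper bits of the timestamp are the frame index:

    /* Sketch: render one buffer, applying events at the right
     * frames. Assumes the per-port event list is sorted by time.
     */
    static void process(A2_EVENT **events, float *out, unsigned frames)
    {
        unsigned now = 0;
        while(now < frames)
        {
            A2_EVENT *e = *events;
            unsigned when = e ? (e->timestamp >> 8) : frames;
            if(when > frames)
                when = frames;  /* belongs to a later buffer */

            /* Render audio up to the next event (or end of buffer) */
            for( ; now < when; ++now)
                out[now] = 0.0f;        /* ...actual synthesis here... */

            if(e && when < frames)
            {
                /* Interpret e->data[] (e.g. start a ramp), then
                 * return the event to the host's pool.
                 */
                *events = e->next;
            }
        }
    }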
> There is definitely no protocol here. Please, please don't say
> "protocol". That way lies painful, digressive conversations, trust
> me.

I'm open to alternative terminology. :-)

What I'm talking about is just the name of "whatever goes on between
connected ports." I don't want the term to be too specific, as it
also covers LADSPA style audio buffers, shared buffers (which can
contain function pointers), and whatever else plugins might use to
communicate.
//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine            |
| http://audiality.org - Music/audio engine              |
| http://eel.olofson.net - Real time scripting           |
'-- http://www.reologica.se - Rheology instrumentation --'