[linux-audio-dev] Aeolus and OSC - comments requested
fons.adriaensen at alcatel.be
Fri May 13 09:54:31 UTC 2005
On Fri, May 13, 2005 at 02:18:38AM +0100, Steve Harris wrote:
> My preferred form would be something like
> /std_prefix/inst_name/base_freq f <base-frequency>
> /std_prefix/inst_name/note_on iff <note-id> <octave> <velocity>
> /std_prefix/inst_name/note_off if <note-id> <velocity>
What's the <octave> param for ?
> But seeing as I've never written an OSC synthesiser, I don't get a vote.
:-) I'm sure you could write a toy synth in less than an hour :-).
I'm not even sure a hard standard would be a good thing.
One of my gripes with MIDI has always been that it imposes a
keyboard-centric model and, worse, the 'voice == note' convention,
on all types of instruments (and also on sequencers, note editors,
etc.). Any form of standard that gets accepted by a majority will
probably have the same problems, for the simple reason that this
model is now widely used, and deviating from it (which will add
complexity) is in the interest of only a minority of users.
For an organ, velocity makes no sense, but OTOH key action need
not be binary: on an instrument with mechanical action you can
press a key halfway and get very weird results. IIRC some
contemporary composers have used this. I've not yet made up
my mind if the /note_on format of Aeolus will support this,
but ideally it should.
Another open question is whether I will use liblo or not. I have
some problems with it, not related to the quality of the code,
but to how it is structured.
Like many GUI toolkits, liblo provides an easy-to-use,
integrated solution that will work in most cases. It is
exactly that integration that gets in the way if you want
to use it in a more complex situation.
As it is, liblo will trigger events on its own built-in timer.
This is not what you actually need in a synth. When processing
an audio period, you want the events that are relevant to that
period, taking into account latency and processing delays.
In other words, you want events not in real time, but ahead of
it, and 'on demand'.
This would require an interface such as:
Event_id get_event (Event_t **event, OSC_time_t limit)
which will give you a pointer to the next event (if any) that
comes before 'limit'. The Event_t should include the timestamp
if any, so you can place the event exactly on the nearest sample
if that's what you want.
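Concretely, placing a timestamped event on the nearest sample is just a conversion from time to frames. A sketch, assuming the timestamp has already been converted to seconds (real OSC time tags are NTP-style seconds plus fraction) and ignoring latency compensation:

```c
/* Map an event timestamp (seconds) to a frame offset within the
   current audio period, rounding to the nearest sample.  Events
   before the period start clamp to frame 0; the caller should not
   pass events at or beyond the period end (that is what get_event's
   'limit' argument is for). */
static int event_frame (double ev_time, double period_start,
                        double sample_rate, int period_size)
{
    double d = (ev_time - period_start) * sample_rate;
    int frame = (int)(d + 0.5);

    if (frame < 0) frame = 0;
    if (frame >= period_size) frame = period_size - 1;
    return frame;
}
```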
You should then have the choice to deal with the event yourself,
or otherwise call
int process_event (Event_id id)
that will do the path and wildcard matching, invoke the
registered callbacks, etc., and finally
void event_done (Event_id id)
that signals to liblo that it can now release any resources
related to the event (adding this call allows a zero-copy
implementation inside liblo).
These three calls should be synchronous, non-blocking, and
able to execute in the context of any calling thread.
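Put together, the per-period loop in the synth would look something like this. This is only a sketch against the hypothetical interface described above; liblo has no such calls today, so the three functions are stubbed here with a fixed in-memory queue so the whole thing compiles and runs:

```c
#include <stddef.h>

typedef int    Event_id;
typedef double OSC_time_t;

typedef struct { OSC_time_t time; int note; } Event_t;

/* Stub event source standing in for the proposed liblo interface. */
static Event_t queue [] = { { 0.001, 60 }, { 0.004, 64 }, { 0.010, 67 } };
static int head = 0;

/* Return the next pending event that falls before 'limit', or -1. */
static Event_id get_event (Event_t **event, OSC_time_t limit)
{
    int n = (int)(sizeof queue / sizeof queue [0]);

    if ((head < n) && (queue [head].time < limit))
    {
        *event = &queue [head];
        return head++;
    }
    return -1;
}

/* In the real thing this would do path and wildcard matching
   and run the registered callbacks. */
static int process_event (Event_id id) { (void) id; return 0; }

/* Tell the library it may release resources tied to this event. */
static void event_done (Event_id id) { (void) id; }

/* One audio period: consume all events due before the period end,
   returning how many were handled. */
static int run_period (OSC_time_t period_end)
{
    Event_t  *ev;
    Event_id  id;
    int       n = 0;

    while ((id = get_event (&ev, period_end)) >= 0)
    {
        process_event (id);   /* or handle 'ev' directly */
        event_done (id);
        n++;
    }
    return n;
}
```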
If you provided this sort of interface, it would still be
possible to also provide a thread that, when optionally
started by the user, reproduces the current behaviour,
so nothing need be lost.