[linux-audio-dev] XAP and Event Outputs

David Olofson david at olofson.net
Wed Dec 11 14:35:01 UTC 2002


On Wednesday 11 December 2002 18.54, Tim Goetze wrote:
> David Olofson wrote:
> >On Wednesday 11 December 2002 15.25, Tim Goetze wrote:
> >> David Olofson wrote:
> >> >So, sort them and keep track of where you are. You'll have to
> >> > sort the events anyway, or the event system will break down
> >> > when you send events out-of-order. The latter is what the
> >> > event processing loop of every plugin will do, BTW - pretty
> >> > trivial stuff.
> >>
> >> what you describe here has a name: it's called queuing.
> >
> >Of course. But it doesn't belong in the event system, except
> > possibly as a host or SDK service that some plugins *may* use if
> > they like. Most plugins will never need this, so I think it's a
> > bad idea to force that overhead into the basic event system.
>
> above, you claim that you need queuing in the event system,
> and that it is 'pretty trivial stuff', in 'every plugin'.
> now you say you don't want to 'force that overhead'.

I did not say that; read again. I was referring to "the latter" - 
that is "keep track of where you are".

That is, look at the timestamp of the next event, to see whether you 
should handle the event *now*, or do some audio processing first. The 
second case implies that you may hit the frame count of the current 
buffer before it's time to execute that next event.


Either way, this is not the issue. Allowing plugins to send events 
that are meant to be processed in future buffers is, and this is 
because it requires that you timestamp with musical time in order to 
handle tempo changes correctly. *That* is what I want to avoid.


> >> >Do event processors possess time travelling capabilities?
> >>
> >> delays based on musical time do, whatever you like to call
> >> it.
> >
> >Then they cannot work within the real time net. They have to be an
> >integral part of the sequencer, or act as special plugins for the
> >sequencer and/or the editor.
>
> so eventually, you'll need a different event system for
> plugins that care about musical time.

No. You'll need a different event system for plugins that want to 
look at future events.


> and what if you come
> to the point where you want an audio plugin that needs to
> handle musical time, or prequeued events? you'll drown in
> 'special case' handling code.

Can you give me an example? I think I'm totally missing the point.


> i'm convinced it's better to design one system that works
> for event-only as well as audio-only plugins and allows for
> the mixed case, too. everything else is an arbitrary
> limitation of the system's capabilities.

So, you want our real time synth + effect API to also be a full-blown 
off-line music editing plugin API? Do you realize the complexity 
consequences of such a design choice?


> using audio frames as the basic unit of time in a system
> producing music is like using specific device coordinates
> for printing. they used to do it in the dark ages, but
> eventually everybody agreed to go independent of device
> limitations.

Expressing coordinates in a document is trivial in comparison to the 
interaction between plugins in a network. Printing protocols are 
rather similar to document formats, and not very similar at all to 
something that would be used for real time interaction between units 
in a net. But that's beside the point, really...

To make my point clear:

We might alternatively do away with the event system altogether, and 
switch to blockless processing. Then it becomes obvious that musical 
time, as a way of saying when something is supposed to happen, makes 
sense only inside the sequencer. Synths and effects would not see any 
timestamps *at all*, so there could be no argument about the format 
of timestamps in the plugin API.

As to plugins being *aware* of musical time, that's a different 
matter entirely.


//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`---------------------------> http://olofson.net/audiality -'
.- M A I A -------------------------------------------------.
|    The Multimedia Application Integration Architecture    |
`----------------------------> http://www.linuxdj.com/maia -'
   --- http://olofson.net --- http://www.reologica.se ---


