[linux-audio-dev] XAP: a polemic

David Olofson david at olofson.net
Mon Dec 16 17:15:09 UTC 2002


On Monday 16 December 2002 14.23, Frank van de Pol wrote:
[...]
> I thought the assumption was to send only events scheduled for the
> block to be processed. If that is the case, all tempo-based, musical
> time references can be converted into sample-relative time (eg.
> sample number within the block). Once the block is being rendered,
> it's too late to make changes. Block sizes[1] determine your latency.
>
> If you are scheduling events ahead of the buffer, a few interesting
> things happen:
>
> - the plugin needs to have some events queued in musical time
> reference (tick) to be able to anticipate tempo changes not known at
> the time the event was scheduled (eg. for synching the position)

Yes.
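
The tick-to-sample conversion Frank describes can be sketched roughly
like this. All the names here are illustrative, not from any actual
XAP header, and it assumes a constant tempo across the block; a real
host would have to segment at tempo changes:

```c
/* Convert a musical-time event position (in ticks) into a sample
 * offset relative to the start of the current block. Assumes tempo
 * is constant across the block. Hypothetical names throughout; this
 * is a sketch of the idea, not a real XAP API. */
static long tick_to_block_sample(double event_tick,
                                 double block_start_tick,
                                 double tempo_bpm,
                                 double ticks_per_beat,
                                 double sample_rate)
{
	double beats = (event_tick - block_start_tick) / ticks_per_beat;
	double seconds = beats * 60.0 / tempo_bpm;
	return (long)(seconds * sample_rate + 0.5);
}
```

For example, at 120 BPM, 480 ticks per beat and 48 kHz, an event 240
ticks into the block lands 12000 samples into the block.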


> - In case of real-time changes to the song, events would need to be
>   adjusted/deleted/inserted. This is something better handled by
> the sequencer host

Although you'll have to do this anyway if you're delaying events by 
musical time and the like. There's no way around it.

My argument against building it into the host:

What do you do with events that "fall outside" as a result of an 
"unexpected" event? Throw them away? (And leave hanging notes.) 
Squeeze them in? (And trigger something out of sync.) Squeeze them 
in, but align them to some note value?

The plugin most probably knows, or the user can tell it, what's best 
to do. The host does *not* know what to do. It would need to have a 
bunch of callbacks, or advanced rules, so that plugins can somehow 
implement or express their intentions for prequeued events.

That is, a prequeue sequencer for plugins that need one isn't 
something you can wrap in a trivial "host->enqueue_event()" call. If 
a generic implementation is at all useful, it should be an optional 
part of the plugin SDK.
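
To make that concrete, a generic SDK-side prequeue might look 
something like this minimal sketch (none of these names come from a 
real XAP SDK): events are held in musical time, and only delivered 
once their tick position falls inside the block being rendered, so 
the queue survives tempo changes and can still be edited.

```c
#include <stddef.h>

/* Hypothetical SDK-side prequeue. Events are stored in musical time
 * (ticks) and flushed into the block once due. Sketch only; a real
 * implementation would sort, grow dynamically, and convert ticks to
 * sample offsets on delivery. */
typedef struct {
	double tick;	/* musical-time position of the event */
	int    value;	/* opaque payload for this sketch */
} PrequeuedEvent;

typedef struct {
	PrequeuedEvent events[64];
	size_t count;
} Prequeue;

/* Copy out all events due before 'block_end_tick', removing them
 * from the queue. Returns the number of events delivered. */
static size_t prequeue_flush(Prequeue *q, double block_end_tick,
                             PrequeuedEvent *out, size_t out_max)
{
	size_t delivered = 0, kept = 0, i;
	for (i = 0; i < q->count; ++i) {
		if (q->events[i].tick < block_end_tick && delivered < out_max)
			out[delivered++] = q->events[i];
		else
			q->events[kept++] = q->events[i];
	}
	q->count = kept;
	return delivered;
}
```

The point is that the plugin (or the SDK on its behalf) keeps control 
of the queue, so it can adjust, delete, or drop entries itself when 
the song changes under it.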


> - In case of repositioning, skipping, reversing and other things a
> user can do with her jog dial, the queue needs to be adjusted
> appropriately. Same as #2

Yep.


[...]
> [1] - in case of lots of blocks in series, the latency might end up
> being quite substantial, or am I overlooking something? If this is
> true, a setup (I compare it to my mixing desk with some outboard
> processors) with a varying number of plugins for every channel
> would have different latency for different channels.

There's no latency added when you chain plugins, except for the 
latency that the DSP algorithm of each plugin might add. (Which, by 
all means, can be significant!) Within an engine cycle, all plugins 
start processing at the same audio sample count.

This is why plugin scheduling order matters! You must run plugins in 
reverse order of dependencies, so the output from one plugin is in 
place before the next plugin starts processing.

This is really rather obvious if you think about running one single 
block's worth of data through a net. You just don't process before 
you have your input data.
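
Put differently, "run in reverse order of dependencies" is just a 
topological sort of the plugin graph. A minimal sketch (hypothetical 
names, not from any real host):

```c
#define MAX_PLUGINS 16

/* Compute a processing order for 'n' plugins such that every plugin
 * runs after all plugins it takes input from. depends_on[a][b] != 0
 * means plugin 'a' reads the output of plugin 'b'. Returns 1 on
 * success, 0 if the graph has a cycle (a feedback loop, which would
 * need explicit one-block delays to break). Sketch only. */
static int schedule_order(int n,
                          int depends_on[MAX_PLUGINS][MAX_PLUGINS],
                          int order[MAX_PLUGINS])
{
	int done[MAX_PLUGINS] = { 0 };
	int placed = 0;

	while (placed < n) {
		int progressed = 0;
		int p, d;
		for (p = 0; p < n; ++p) {
			int ready = 1;
			if (done[p])
				continue;
			for (d = 0; d < n; ++d)
				if (depends_on[p][d] && !done[d])
					ready = 0;
			if (ready) {
				order[placed++] = p;
				done[p] = 1;
				progressed = 1;
			}
		}
		if (!progressed)
			return 0;	/* cycle: no plugin could be scheduled */
	}
	return 1;
}
```

With a chain 0 -> 1 -> 2, this yields the order 0, 1, 2, so each 
plugin's input buffer is filled before it runs.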


//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`---------------------------> http://olofson.net/audiality -'
   --- http://olofson.net --- http://www.reologica.se ---


