[linux-audio-dev] XAP spec - early scribbles

David Olofson david at olofson.net
Thu Feb 27 14:27:00 UTC 2003


On Thursday 27 February 2003 07.47, torbenh at gmx.de wrote:
[...]
> > There's only one call; process(). (Or run(), as some prefer to
> > call it.) This is called once for each block, whether or not the
> > plugin processes audio. Might sound odd, but considering that the
> > unit for event timestamps is audio frames, it turns out to be
> > really rather logical - in theory as well as in implementation.
>
> Yeah, you're right.
> But then I have to implement graph ordering,
> which galan does not do at the moment.
> I will also have to isolate the OpenGL stuff from
> the audio processing... Well, I have to do that
> anyway...

Yeah. Without strict graph ordering, you get effectively "random" 
latency in every connection. At best, it depends in non-obvious ways 
on the order in which plugins are connected, or something like that. 
I don't think that's a good idea - not even for blockless processing.
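One way a host can get deterministic latency is to run the plugins in 
topological order of their connections. This is not anything from the XAP 
spec - just a minimal sketch of the idea using Kahn's algorithm, with 
made-up plugin names:

```python
from collections import defaultdict, deque

def process_order(connections):
    """Return a deterministic processing order for a plugin graph.

    connections: iterable of (source, destination) plugin pairs.
    Raises ValueError on a cycle - a feedback connection, which
    needs an explicit delay to break (see below in the thread).
    """
    succ = defaultdict(list)
    indegree = defaultdict(int)
    nodes = set()
    for src, dst in connections:
        succ[src].append(dst)
        indegree[dst] += 1
        nodes.update((src, dst))
    # Sort the roots so the result doesn't depend on set iteration order.
    ready = deque(sorted(n for n in nodes if indegree[n] == 0))
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in succ[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                ready.append(m)
    if len(order) != len(nodes):
        raise ValueError("cycle: a feedback connection needs an explicit delay")
    return order
```

With an ordering like this, every plugin sees its inputs for the current 
block before it runs, so connection latency no longer depends on hookup 
order.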


[...]
> > This suggests to me that "look-ahead" is a special case interface
> > for some specific cases; not an idea to build a real time API
> > upon.
>
> yes...
> a sequencer could have a clock and a prepare-clock
> and then outs and prepare-outs....
> this fits on top of XAP.

Sure, but I can't quite see where it would be useful. It's just a 
means of squeezing inherently non-real time features into a strict 
real time system. Real time controls should never need this, because 
that would make it impossible to use the plugins in *true* real time 
systems.


> hmmm... but what if
>
> an event gate:
>
> if it gets an event on input
>   if the last event on tocomp == the last event on comp
>     output event from input.
[...]
> this is not very exact, but it is possible to build such
> a mesh in galan. Hope you get what I mean from the text....

I'm not sure... Is this basically about splitting actions up into two 
events; one "this is what we'll do" event, and one "do it now" event?
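If that reading is right, the bookkeeping might look something like the 
sketch below. The `TwoPhaseGate` class and the prepare/commit event names 
are hypothetical - they are not from galan or XAP:

```python
class TwoPhaseGate:
    """Pass an action through only once its 'prepare' half has arrived.

    A 'prepare' event announces the action (e.g. a sample to load);
    the later 'commit' event with the same tag actually triggers it.
    """
    def __init__(self):
        self.prepared = set()
        self.fired = []

    def prepare(self, tag):
        self.prepared.add(tag)

    def commit(self, tag):
        if tag in self.prepared:
            self.fired.append(tag)
            return True
        # Commit arrived before its prepare: drop it (or defer it).
        return False
```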


[...]
> in XAP this event feedback would already be illegal.

No, feedback isn't illegal in any way. It's just "late" events that 
are illegal; only events for the current or future blocks may appear 
in an input event queue.

If you set up a feedback loop, you have to make sure events are 
delayed appropriately, so timestamps remain valid. This automatically 
eliminates block size quantization issues, as the delay must be at 
least the duration of one block. If you need shorter feedback 
latency, use smaller blocks. If you need less than one sample, well, 
then it's time you think about implementing a new plugin. You'll 
probably have to use approximations, prediction filters and/or 
oversampling.
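The rule can be sketched in a few lines. The block size and the 
(timestamp, payload) event representation here are illustrative, not 
anything mandated by the spec:

```python
BLOCK_SIZE = 64  # frames per block; illustrative value

def delay_events(events, delay_frames=BLOCK_SIZE):
    """Shift event timestamps forward across a feedback connection.

    events: list of (timestamp_in_frames, payload) produced in one block.
    The delay must be at least one block; otherwise the receiver would
    see "late" events - timestamps in a block it has already processed.
    """
    assert delay_frames >= BLOCK_SIZE, "feedback delay must be >= one block"
    return [(ts + delay_frames, payload) for ts, payload in events]
```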


> (but if the code was smart enough it could solve
>  the cycle at the delay)

Plugins won't and shouldn't care. Feedback is a host implementation 
issue.


> in galan this is legal but provides the user with
> the possibility to build meshes which are not realtime
> safe... you can also build for loops etc...

Right. This is a fact of nature we have to deal with. Feedback loops 
*always* have some latency (be it nanoseconds or milliseconds), and 
there's nothing we can do about it, except make sure we know where 
those latencies are, and consider them when tuning our designs.


> if there was a component which converted a string event
> to a sample event, this could be connected before the
> delay.
>
> so the sample would already be loaded when it is time
> for the delay to let the event pass...

There's simply no other way of doing it. It could theoretically take 
*minutes* to load all the samples for a song. (At least if they're on 
CDs in a CD-ROM changer...) What can the API or plugins do about 
that...?


> the sample loading component would fire a worker thread,
> which inserts the event with the same timestamp
> as the incoming event... (at this point the timestamp is
> in the past)
> the evtdelay adjusts the timestamp to be in the future
> again.

I don't see much point in this. How do you figure out the delay value?


> I take this approach for MIDI in also...
> (not implemented yet)
> the MIDI event has a timestamp from ALSA, which corresponds
> to the past. Without the delay it would be processed now...
> this would generate jitter.
> With the delay some latency is imposed, but the jitter is gone.
> The user can adjust the delay to his machine...

That's very different. This is what you're expected to do with 
incoming real time events, and the resulting constant latency is 
strictly defined by the block size. That's all there is to it. It has 
nothing to do with worker callbacks, delayed events or other "tricks" 
to deal with plugins that have non-RT safe controls.
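The mapping can be sketched as follows. The sample rate, block size, and 
the assumption that the wall-clock origin coincides with audio frame 0 are 
all simplifications for illustration:

```python
SAMPLE_RATE = 48000  # illustrative values, not mandated by XAP
BLOCK_SIZE = 64

def schedule_midi(event_time_s, block_start_frame, latency_blocks=1):
    """Map a wall-clock MIDI timestamp onto a frame in a future block.

    event_time_s: when the event arrived, in seconds (in the past;
        assumed to share its origin with frame 0 of the audio stream).
    block_start_frame: first frame of the block now being processed.
    Adding a *constant* latency of latency_blocks blocks removes the
    jitter: every event keeps its exact frame offset, just shifted.
    """
    event_frame = round(event_time_s * SAMPLE_RATE)
    target = event_frame + latency_blocks * BLOCK_SIZE
    # Never emit a "late" event into a block already processed.
    return max(target, block_start_frame)
```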


> > > this seems to be not feasible, because the event peeker would
> > > be fairly complex. It must not follow the code path of the clock
> > > posting events to itself, because of the infinite loop etc...
> > > not taking into account bigger event feedback loops....
> >
> > The "event peeker" has to be part of the clock and send events
> > that are part of a special protocol for this task.
> >
> > We have something similar in XAP, meant for hard disk recorders
> > and other plugins that need pre-buffering time before they can
> > start playing from a random song position. A sequencer (or
> > rather, timeline) can inform such plugins about loop and jump
> > points beforehand, so that specific pre-buffering can be used to
> > make it possible to jump to these points with zero latency.
>
> ok... it is time for me to look at the XAP specs...
> where are they?
> as it seems you are already defining the higher levels...

We have discussed a few of the higher level concepts, but there's 
still a lot of work to do in that area.

As to the actual spec document, I have yet to see it myself! :-) So 
far, all that's available is the list archives (ouch!) and the 
terminology document on the site:

	http://xap-plugins.org


Tim Hockin is working on the first preliminary spec. For now, I 
suppose you can bug us with questions. If you try hard enough, we 
might give in and release what we have. ;-)


[...]
> > > yes... But due to the event peeking code it would get the event
> > > 100ms before it is due. The event peeker is too complicated
> > > though.
> >
> > In fact, it's not even possible to implement in a generic way.
> > (See above.)
>
> i think i have found a method above...

I don't think so. You can't see into the future, so you must use a 
delay. When you're dealing with non-deterministic operations (which 
is the only case where you need to mess with this stuff), you can't 
figure out what a sufficient delay time would be. When dealing with 
"live" input, delays are just not an option, as they'd defeat the 
whole purpose of real time processing.


[...]
> Yes... but galan has no stop. It is always running.

Just like hardware synths - and XAP should work the same way. The way 
I see it, very few hosts have valid reasons to ever stop processing.

What I'm talking about is the *sequencer*, which will always have 
transport control (with stop, start, rewind etc), and that's where 
this comes in. When you load a song, you'll have to initialize the 
net before you can start playing, just as with external hardware 
samplers that need to grind and rattle some before you can play.


> But a change sample will fire a worker and leave the old sample
> until the new sample arrives.

Yes, but that's just a plugin implementation detail. The only API 
implication it has is that such controls should be marked as "may not 
respond instantly" - and not even that is strictly required. (MIDI 
samplers don't have such a feature, AFAIK.)
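Such a marking could be as simple as a hint flag on the control 
descriptor. The flag names below are made up for illustration - they are 
not actual XAP identifiers:

```python
# Hypothetical control-hint flags (not actual XAP identifiers).
CTRL_RT_SAFE = 1 << 0   # responds within the current block
CTRL_ASYNC   = 1 << 1   # "may not respond instantly" (worker thread, I/O)

def safe_for_tight_automation(flags):
    """A host should only record sample-accurate automation on controls
    that are real-time safe and not serviced asynchronously."""
    return bool(flags & CTRL_RT_SAFE) and not (flags & CTRL_ASYNC)
```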


[...]
> do want to have audiality running in the kernel ?

Actually, the "second generation" of Audiality (the "real" one is the 
third) was intended to run in kernel space, but that was before 
Linux/lowlatency. Back then, the only way to get solid RT performance 
on Linux was through RTLinux, so that's what I went for.

Anyway, both RTLinux and RTAI can schedule hard RT threads in user 
space these days, so there's still no need to run in kernel space, 
even if you need lower latency than Linux/lowlatency can provide.


[...]
> too bad I missed the XAP discussions...
> but I had to waste my time with 1st-semester Computer Science
> (at least I attended Math 3, which was kind of interesting)
>
> it gave me a different look (more algebraic than the engineers')
> on the analysis

Can't see how that can be wasted time. :-)


[...ebuild...]

Sorry! I should have known that, given that I've considered using 
Gentoo. The name just didn't ring a bell somehow, and the first 
document I found didn't say what the tool is *for*... :-)

Anyway, the explanation is simple; I don't support packages at all.
Whatever is there is old contributed stuff that came from Kobo Deluxe, 
which is where Audiality was born. It most probably doesn't work. I 
should clean up my scripts...


//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -'
   --- http://olofson.net --- http://www.reologica.se ---



