[linux-audio-dev] XAP again - channels, etc.

Simon Jenkins sjenkins at blueyonder.co.uk
Sat Mar 22 02:42:00 UTC 2003


Tim Hockin wrote:

>LADSPA is much the same way - connect anything to anything.  But several
>people in the XAP discussion feel that normalized data (0 to 1.0 or
>whatever) is bad.  I am still of the position that I could be convinced to
>support two basic control types: numerical (normalized) and other (strings,
>data block, etc).  This, however, is still not the same as audio-rate
>controls, which is what you get when you plug an oscillator into a knob.
>
>The simplest concept is that they are different things.  Audio and Control
>data.  Reconsidering this notion would take us WAY back to early XAP
>discussions.  Maybe that is OK - anyone want to make a case for a new
>fundamental model?
>
Ok. Mostly for fun, though...

+++A Case for a New Fundamental Model+++

The current model isn't particularly fundamental. It conflates aspects
of the problem which are actually quite separate:

1. What kind of data is entering/leaving a port?
2. When and how often is it entering/leaving?
3. What do various control data types mean?
4. How is the meaning encoded?
5. How/when is the encoding compressed?
6. What can be sensibly connected to what?

It proposes one fixed set of answers for "audio" data and another,
different fixed set of answers for "control" data.

Is this a clean, natural division? I don't think so. Why does the
model treat the output of an envelope generator as being (in many
respects) more like a string than it is like an oscillator output?
Why isn't it clear what an LFO's output type should be? Why does
something as simple as a ramp spawn so much complexity?

Even where the audio/event distinction looks clear-cut, it's not. The
user toggles a switch. That's an event, surely? Well... maybe it is,
but then, what does the plugin do with the event? If the switch is
going anywhere near the audio path then it probably makes itself a
little ramp or curve and uses that instead. Why? "To avoid clicks".
The ramp/curve is a crude low-pass filter! The plugin is doing audio-rate,
audio-resolution signal processing on something the model tells us is
fundamentally different from and incompatible with an audio signal.
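To make that concrete, here is a rough C sketch (names invented by me,
not taken from any XAP or LADSPA header) of what such a plugin typically
does inside: the boolean switch is smoothed into an audio-rate ramp
before it ever touches the samples.

#include <stddef.h>

typedef struct {
    float target;   /* 0.0f or 1.0f, set when the switch event arrives */
    float current;  /* the smoothed, audio-rate version of the switch  */
    float coeff;    /* one-pole smoothing coefficient, e.g. 0.999f     */
} switch_smoother;

/* Apply the switch to an audio buffer: the "control" is low-pass
 * filtered at full audio rate and resolution before it multiplies
 * the signal, exactly as described above. */
static void run_switched_gain(switch_smoother *s,
                              const float *in, float *out, size_t frames)
{
    for (size_t i = 0; i < frames; i++) {
        s->current += (1.0f - s->coeff) * (s->target - s->current);
        out[i] = in[i] * s->current;
    }
}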

Something has to make this ramp/curve(/whatever), because at its
source (e.g. a UI widget or an incoming MIDI message) the toggling of a
switch *is* an event. But the rise and fall characteristics should
belong to the owner of the source, not to the destination. An app
that displays a button on its GUI should be able to determine the
characteristics of that button. Is it hard edged? Soft edged?
Asymmetrical? Different circumstances call for different behaviour.
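One way to express that (purely illustrative, with field names of my
own invention rather than anything from a XAP draft) is for the toggle
event itself to carry the edge characteristics its source wants:

#include <stdint.h>

typedef enum { EDGE_HARD, EDGE_LINEAR, EDGE_SMOOTH } edge_shape;

typedef struct {
    uint32_t   frame;       /* sample frame at which the toggle occurs */
    int        state;       /* new switch state: 0 or 1                */
    edge_shape shape;       /* hard edge, linear ramp, S-curve, ...    */
    uint32_t   ramp_frames; /* how long the transition should take     */
} toggle_event;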

Some undeveloped ideas towards an alternative model:

~ Rather than making the distinction between "audio" and "control"
data, make it between "signal-path" and "non-signal-path" data.

~ Signal-path data is at full audio rate/resolution and doesn't
contain unpleasant things such as instantaneous edges. This doesn't
necessarily mean that it's always represented by a buffer full of
floats in memory (though that would be one way to implement it) but
whatever turns up at the input must *encode* a buffer's worth of
data, sample-synchronised with all the other signal-path data in
the system.

~ Non-signal-path data is event-like. Something happens at a
particular instant. The encoding is some reasonable representation
of what happened. The instant is a particular sample period, i.e.
non-signal-path data is still sample-synchronised with signal-path
data; it just doesn't occur on every sample.

~ Anything which can start life as an event but end up interacting
with the signal path has two representations: The event representation
(a bit like MIDI) and the audio-like signal-path representation (a bit
like CV). Crucially, plugins don't directly combine signal-path and
non-signal-path inputs. A plugin which switches audio, for example,
has a signal-path switch input not an event switch input. A plugin
which switches events, OTOH, has an event switch input.

~ Finally, define some default conversion plugins which can "cast"
between various events and signals, allowing most things to be
connected to most others.
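As a very rough illustration of the last two points (again, all names
are invented, and plain C structs stand in for whatever XAP would
actually define), a "cast" plugin might consume sample-stamped toggle
events on its event input and emit a ramped signal-path buffer:

#include <stdint.h>
#include <stddef.h>

typedef struct {
    uint32_t frame;   /* sample frame within the current block */
    float    value;   /* new target value, e.g. 0.0f or 1.0f   */
} xcast_event;

typedef struct {
    float    current;     /* running signal-path value          */
    float    target;      /* value we are ramping towards       */
    float    step;        /* per-sample increment while ramping */
    uint32_t ramp_frames; /* source-specified transition length */
} xcast_state;

/* Render one block: consume sample-stamped events on the event input,
 * emit a full buffer's worth of signal-path data on the output. */
static void xcast_run(xcast_state *s,
                      const xcast_event *ev, size_t n_events,
                      float *out, uint32_t frames)
{
    size_t e = 0;
    for (uint32_t i = 0; i < frames; i++) {
        while (e < n_events && ev[e].frame == i) {
            s->target = ev[e].value;
            s->step   = (s->ramp_frames > 0)
                      ? (s->target - s->current) / (float)s->ramp_frames
                      : (s->target - s->current);
            e++;
        }
        if (s->current != s->target) {
            s->current += s->step;
            /* clamp on overshoot so we land exactly on the target */
            if ((s->step > 0.0f && s->current > s->target) ||
                (s->step < 0.0f && s->current < s->target))
                s->current = s->target;
        }
        out[i] = s->current;
    }
}

The events stay sparse and cheap on the non-signal-path side, while
everything downstream of the cast sees ordinary, sample-synchronised
signal-path data.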

Simon Jenkins
(Bristol, UK)




