On Monday 16 December 2002 14.35, Paul Davis wrote:
[...]
* Host sends buffers of N samples to plugins, with a starting timestamp
* Things send timestamped events to plugins
* Timestamps are measured in whole audio samples
* Host/timeline must export/deliver:
  - SAMPLE RATE (samples/sec - passed at instantiation)
  - TEMPO (sec/tick - event?)
  - PPQN (ticks/qn - host-global? event? fixed?)
i suggest forgetting this. VST sets it to 1 PPQN, and treats "QN"
as "1 beat", so the information is really quite useless. there's no
need for it given TEMPO and METER.
Yes, almost. I think it's useful to know about the METER, in case
someone tells you something about "N bars", or "every beat" or
whatever.
[...]
* can we take any of this tracking work away from the plugins?
well, i think you haven't yet defined quite how the plugin gets the
information. after the discussion about callbacks, i return to
favoring events delivered to a Well Known Control. to flesh out
what you suggest about:
SAMPLE_RATE (samples/sec, passed at instantiation)
TEMPO (samples-per-beat, delivered as an event, see below)
METER (beats-per-measure + beat-note-value, delivered as an event)
the TEMPO event needs to include:
samples-per-beat (as a floating point value) [ time between beats ]
ramp slope (possibly 0) [ for accelerando and ritard ]
ramp target (irrelevant if ramp slope is zero) [ ditto ]
It looks like Controls will take two or three forms of events:
SET(..., value)
RAMP(..., target, duration)
SPLINE(..., target, target_slope, duration)
Basically another form for the same thing. I find this representation
slightly more logical, since it gives you an explicit duration, which
lets you insert a "stop here" event into your own queue, or whatever.
You can do that either way, but since timestamps are only sample
accurate anyway, I see a risk of target + slope opening up for
different interpretations of exactly when the slope ends.
OTOH, that could actually be allowed for, and seen as a feature - but
how useful is subsample accurate timing that only applies to the ends
of ramps, but nothing else?
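To make the difference concrete, here's a minimal sketch of what a plugin
might do with a RAMP(target, duration) event, assuming durations are in
sample frames (all names below are made up for illustration):

/* Sketch only: applying RAMP(target, duration) to a control value.
 * With an explicit duration in frames, the end point is unambiguous.
 */
typedef struct
{
    float    value;        /* current control value */
    float    delta;        /* per-sample increment */
    unsigned frames_left;  /* the "stop here" point */
} ramp_state;

static void ramp_start(ramp_state *rs, float target, unsigned duration)
{
    rs->delta = (target - rs->value) / (float)duration;
    rs->frames_left = duration;
}

static void ramp_run(ramp_state *rs, unsigned frames)
{
    while (frames && rs->frames_left)
    {
        rs->value += rs->delta;
        --frames;
        --rs->frames_left;
    }
}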
Just one issue: Is float32 sufficient for tempo? (We'd need to switch
to 64 byte events, or use double events to handle 64 bits +
ramping...)
Position could just be double, and no ramping, right? (No problem
fitting that in a 32 byte event.) Provided that adding a float to a
double extends the float to double first, you'd actually have to
*make* this go wrong explicitly to break anything.
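For reference, a TEMPO event payload along those lines might look like
this - a sketch only, with invented field names, assuming a 16 byte event
header and 32 byte events:

/* Sketch: TEMPO event payload; 8 + 4 + 4 = 16 bytes after the header.
 * Nothing here is part of any agreed API.
 */
typedef struct
{
    double samples_per_beat;  /* time between beats */
    float  ramp_slope;        /* 0 ==> constant tempo */
    float  ramp_target;       /* ignored if ramp_slope is 0 */
} XAP_tempo_event;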
the METER event needs to include
beats-per-measure (floating point value) [ 3, 5, 7, 9.5 etc ]
beat-note-value (floating point value) [quarter, 1/16th, etc]
Would two floats (32 bit) do?
Ok, that would mean we're dealing with special events (rather than
standard Control events), but in most cases, I think that just makes
implementation easier. It has no effect on the "Control export" and
connection stuff anyway, since that's all about abstract objects.
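For example, the METER event payload could literally be just the two
floats (again, names are invented):

/* Sketch: METER event payload - two 32 bit floats fit with room to spare. */
typedef struct
{
    float beats_per_measure;  /* 3, 5, 7, 9.5 etc */
    float beat_note_value;    /* e.g. 4 = quarter, 16 = 1/16th */
} XAP_meter_event;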
in addition to this, the host/API needs to provide two other
functions:
"get current time information"
- the structure needs to be based on the VST time info structure.
it will indicate the sample position of the next beat, and
the next bar.
Good idea. VST gives you a time info for the first sample in the
buffer, which seems like a good way of complicating things for the
host, for no good reason. (Provided that plugins get accurate
timeline data by other means, that is. That's not the case in VST.)
it will indicate the state of the transport,
I think you need events *as well* for that, since transport events
can occur at any time, not just between bars. (Though most people
set loop markers between bars, I'd think.)
including loop information in the way that JACK's time info
structure does.
Yes, that could make it easier for some musical time dependent
plugins to figure out what to do with events that don't "fit in"
after a transport event.
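Just to have something concrete to point at, an XAP_time_info along the
lines discussed so far might carry roughly this (every field below is
tentative, nothing is agreed on):

/* Sketch only - fields suggested by the discussion above. */
typedef struct
{
    double position;           /* musical position, in beats */
    double samples_per_beat;   /* current tempo */
    float  beats_per_measure;  /* current meter */
    float  beat_note_value;
    long   next_beat_frame;    /* sample position of the next beat */
    long   next_bar_frame;     /* sample position of the next bar */
    int    transport_state;    /* stopped, moving, ... */
    int    looping;            /* loop info, JACK style */
    long   loop_start_frame;
    long   loop_end_frame;
} XAP_time_info;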
it can only be called from the "process" handler of the plugin,
because the information is all specific to the current block being
processed.
Yes.
But I still like the idea of multiple timelines. :-)
What I had in mind was for whoever maintains the timeline to also
send time info structs, by reference, to those that want them. As
events, that is, so you get the same Channel addressing capabilities
as for normal Controls.
Alternatively:
host
{
    /* For timeline "generators" */
    int (*register_timeline)(...timeinfo callback&stuff...);

    /* For whoever wants to know what they're sync'ing to. */
    XAP_time_info *get_time_info(int timeline_id);
}
Now, if you're a sequencer (plugin or not), just register your time
info callback with the host. Mark timeline-related events you send
with the return value - which is your timeline's host-global handle.
(Or have a control for this, if it doesn't fit anywhere. It should
fit in the POSITION_CHANGE event, though - and you'll *have* to get
one of those first thing when the sequencer is started.)
A plugin that wants more than what it gets through the timeline
controls can just look at one of those events and ask the host about
the current XAP_time_info for the timeline that event came from.
Not *that* complicated, and now, hosts don't have to understand
*anything* about timelines. The host just provides an indirect way
for plugins and other "objects" in the system to ask each other
directly about this stuff. The host would just look up the right
timeline and forward the call.
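In code, the whole round trip could be as small as this - a sketch
assuming the host struct above, a hypothetical my_time_info_callback, and
a POSITION_CHANGE event that carries the timeline handle (the exact
register_timeline() arguments are still undefined):

/* Sequencer side: register the timeline, keep the handle around. */
seq->timeline_id = host->register_timeline(my_time_info_callback, seq);

/* Synced plugin side, inside process(), when a POSITION_CHANGE event
 * 'ev' arrives with a timeline handle in it: */
XAP_time_info *ti = host->get_time_info(ev->timeline_id);
/* ...align to ti->next_beat_frame, ti->next_bar_frame, etc... */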
BTW, I think it might be a good idea to send a notification event to
plugins that care about position whenever the meter is changed, or
something, so they can ask for the new XAP_time_info, in case they
don't always do that every block.
You *shouldn't* need to do it more than once per bar, if each one is
valid for one bar, but if someone edits the timeline, something must
tell you that your data is no longer valid...
"convert musical time"
- passed a musical time (B|b|t) and an indicator of which
timeline to use, it returns the audio frame at which
the event will occur, independent of transport state.
How is that possible? (Provided "audio frame" refers to free-running
time as used for event timestamps and, implicitly, audio.)
if transport state matters, the plugin will have to handle that
itself. this function doesn't require host information, but it does
require a way to access a (possibly) shared timeline and it needs to
be implemented only once :)
Well, anyone can "export" a timeline through the callback forwarding
system I proposed above. Will that do?
As to only implementing it once: how about a plugin that is *only* a
timeline? You'd use one to drive a sequencer, or you could just hook
it up directly to any musical time aware plugins, if you have no need
for a sequencer.
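As a sketch of what that call boils down to, assuming constant tempo and
meter for simplicity (a real timeline would walk its tempo/meter map
instead; all names are invented):

/* Sketch only: (bar|beat|tick) -> audio frame, constant tempo/meter. */
static double musical_to_frame(double bar, double beat, double tick,
                               double ticks_per_beat, double beats_per_bar,
                               double samples_per_beat, double frame_at_origin)
{
    double beats = bar * beats_per_bar + beat + tick / ticks_per_beat;
    return frame_at_origin + beats * samples_per_beat;
}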
i haven't followed the rest of the XAP discussion to know how Well
Known Controls would be established,
Like any controls. From the host/plugin registration and connection
POV, they just need a type ID, so you can't mix them up with controls
of incompatible types, but that's in place already. (For float and
string controls. We might want to have double as well, although
special-casing POSITION_CHANGE would do.)
As to implementation, it's just a matter of using custom events, if
desired. I can't see a real need to handle these as if they *really*
were normal Controls (i.e. running PITCH into POSITION_CHANGE) - and
if you *really* want to do that, you can hack a converter plugin
anyway. I think it's more important to keep this as clean and simple
as possible.
but if a plugin needs tempo
sync it should go through whatever steps are necessary to define a
"tempo" control, which will receive TEMPO and METER events as
necessary.
Yep. And POSITION_* events, IMHO, to ensure that plugins don't drift
out of sync, even if they don't care to ask for XAP_time_info all the
time.
we might add a BEAT event, which would be delivered by the host on
an intermittent basis, and would contain:
beat number
tick number
I would think you only really need one of these, since you can
calculate the other if you really need it. (Ask for time info, or
track meter changes, or whatever there is.)
Anyway, these would both have to be doubles (64 bit), since integers
would introduce large rounding errors (can't have *both* timestamps
and positional data as integers), and float (32 bit) has
insufficient resolution.
We *can* squeeze two doubles in an event, but that's it. The header
(next, timestamp, action, target) is 16 bytes. (Well, we could go 64
byte/event, but not for a single special case like this.)
sample position
That's the event timestamp - unless you're referring to audio time as
used by hard disk audio tracks? Is that really needed, if musical
time is subsample accurate?
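For the sake of the arithmetic, a 32 byte BEAT event with that 16 byte
header would look roughly like this (sizes assume a 32 bit host, as was
the norm; field names are invented):

/* Sketch: 16 byte header + two doubles = 32 bytes, and that's it. */
typedef struct XAP_beat_event
{
    struct XAP_beat_event *next;  /* 4 bytes on a 32 bit host */
    unsigned timestamp;           /* 4 bytes */
    int      action;              /* 4 bytes */
    int      target;              /* 4 bytes */
    double   beat;                /* 8 bytes */
    double   tick;                /* 8 bytes - no room for anything else */
} XAP_beat_event;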
this will allow tempo-synced plugins to align to what has been
called the tick edge. beats are sufficiently intermittent that this
doesn't cause an event overload even if one is sent for every beat.
however, there is no need to do this - it would be fine to deliver
one every time the transport state changes to "moving" so as to
resync everybody who cares.
Exactly.
//David Olofson - Programmer, Composer, Open Source Advocate
.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`---------------------------> http://olofson.net/audiality -'
   --- http://olofson.net --- http://www.reologica.se ---