On Saturday 01 December 2007, Dave Robillard wrote:
[...]
> Taking a step back, it would be nice to have these events (being
> generic) able to use something other than frame timestamps, for
> future dispatching/scheduled type event systems.
[...]
I'm not sure about the scope of this LV2 event system, but the idea of
using timestamps related to anything other than audio time (ie
buffers, sample frames etc) seems to violate the idea that events are
essentially structured control data - ie anything that's "locked" to
the audio stream.
The only valid reason I can see for using an unrelated timebase for
timestamps is when the timestamps can only be translated to actual
(audio) time by the receiving plugin. This is an interesting concept,
but how would you use it properly in the context of an event system
where the event transport is based on the "one buffer per buffer
cycle" idea? Sure, you can just send events and assume that the
receiver will queue them internally as needed, but how does the sender
know how far ahead it needs to send to avoid events arriving late?
(And if it already knows the audio/timestamp relation, why use
non-audio timestamps in the first place?)
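The parenthetical is the crux: if the sender already knows the tempo
and sample rate, translating a musical timestamp into a frame offset
is trivial, so it could have sent frame timestamps to begin with. A
hypothetical sketch, assuming constant tempo (all names are mine, not
from any real API):

#include <stdint.h>

/* Convert a beat position to a frame offset within the
 * current cycle. A sender that can do this conversion has
 * no need for non-audio timestamps on the wire.
 */
static uint32_t beat_to_frames(double event_beat,
                               double cycle_start_beat,
                               double bpm,
                               double sample_rate)
{
	double beats = event_beat - cycle_start_beat;
	double seconds = beats * 60.0 / bpm;
	return (uint32_t)(seconds * sample_rate + 0.5);
}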
I just don't see how this can work, not in a real-time system. Either
you deal with exactly one buffer at a time - in which case you may as
well use audio-based timestamps at all times - or you need some sort
of random-access event system.
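For reference, the "exactly one buffer at a time" model in sketch
form: events arrive sorted by frame offset, and the plugin splits its
audio rendering at each event. The names are illustrative, not a real
plugin API; render() and handle_event() stand in for the plugin's own
DSP and event handling:

#include <stdint.h>

typedef struct Plugin Plugin;
typedef struct { uint32_t frames; uint16_t type; uint16_t size; } Event;

/* Implemented elsewhere; declared here to make the loop readable. */
void render(Plugin* p, float* out, uint32_t nframes);
void handle_event(Plugin* p, const Event* ev);

void run_cycle(Plugin* p, const Event* events, uint32_t n_events,
               float* out, uint32_t nframes)
{
	uint32_t pos = 0;
	for (uint32_t i = 0; i < n_events; ++i) {
		/* Render audio up to the event's timestamp... */
		render(p, out + pos, events[i].frames - pos);
		pos = events[i].frames;  /* assumed < nframes */
		/* ...then apply the event, sample-accurately. */
		handle_event(p, &events[i]);
	}
	/* Render the remainder of the buffer after the last event. */
	render(p, out + pos, nframes - pos);
}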
[...]
> (This may sound a bit esoteric, but to do something like Max right,
> you need absolute time stamps).
What does Max do that requires this, and how does it actually work?
I'm probably missing the point here...
[...]
//David Olofson - Programmer, Composer, Open Source Advocate

.-------  http://olofson.net - Games, SDL examples  -------.
|        http://zeespace.net - 2.5D rendering engine       |
|       http://audiality.org - Music/audio engine          |
|     http://eel.olofson.net - Real time scripting         |
'--  http://www.reologica.se - Rheology instrumentation  --'