Either I'm misunderstanding the answers, or I have not done a good job of
asking my question.
In more detail, here's what I'm curious about: how have people done the following?
The sequencer has a clock, so it knows what time 'now' is in bars:beats:ticks.
Events are stored somehow, keyed to a time in bars:beats:ticks. They may
be added on the fly at any time, and the sequencer must be able to hop
around non-linearly in time (looping, jumping to markers, etc.). How does the
sequencer engine find the events stored for 'now', quickly enough that we can
be somewhat deterministic about it getting all the events for
any time? ('now' may even be different on a track-by-track basis.)
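For illustration, here is one sketch of a structure that meets those requirements: a time-sorted event list queried by tick range, so each audio callback pulls everything in its window and a loop or marker jump is just a binary search. This is only a sketch of one possibility, not a claim about how any existing sequencer does it; all names are hypothetical.

```python
import bisect

timeline = []  # (tick, event) tuples, kept sorted by tick


def add_event(tick, event):
    # O(log n) to locate the slot, O(n) to shift entries; fine for
    # moderate event counts, and insertion order is preserved per tick
    bisect.insort(timeline, (tick, event))


def events_between(start, end):
    # All events with start <= tick < end, e.g. one callback's window.
    # Seeking to an arbitrary 'now' is the same O(log n) search, so
    # looping and jumping to markers cost no more than linear playback.
    lo = bisect.bisect_left(timeline, (start,))
    hi = bisect.bisect_left(timeline, (end,))
    return [ev for _, ev in timeline[lo:hi]]
```

The range query is the important part: the callback asks once per buffer for "everything from the last position up to now", rather than probing tick by tick.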
Does it look up 'now' in some kind of hashed pile of events, where events
are keyed by time? That makes me worry about hashing algorithms, but it
would certainly be the easiest to implement.
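As a sketch of what that hashed approach might look like (assuming events are quantized to integer ticks; the names are made up for illustration):

```python
from collections import defaultdict

# tick -> list of events scheduled at exactly that tick
events_by_tick = defaultdict(list)


def add_event(tick, event):
    events_by_tick[tick].append(event)


def events_for(tick):
    # O(1) average-case lookup; a tick with no events yields an empty list
    return events_by_tick.get(tick, [])
```

The catch is that lookups are exact-match only: the callback has to query every tick covered by the current audio buffer, one hash probe per tick, and there is no cheap way to ask "what is the next event after tick N".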
Is there some kind of master timeline array that events get attached to?
That seems like it would be quick to seek to a point, but it would use a
lot of RAM for the timeline array, and I'm not sure how one would handle
unlimited-length timelines.
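One way the unlimited-length problem is sometimes sidestepped (an assumption on my part, not something established above) is to page the timeline array in fixed-size chunks, so only the regions of the song that actually contain events get allocated. A rough sketch, with all names and the chunk size chosen arbitrarily:

```python
CHUNK_TICKS = 4096  # ticks per allocated page; arbitrary illustrative value

# chunk index -> list of per-tick event lists, allocated on demand
chunks = {}


def add_event(tick, event):
    idx, offset = divmod(tick, CHUNK_TICKS)
    chunk = chunks.setdefault(idx, [[] for _ in range(CHUNK_TICKS)])
    chunk[offset].append(event)


def events_for(tick):
    idx, offset = divmod(tick, CHUNK_TICKS)
    chunk = chunks.get(idx)
    return chunk[offset] if chunk is not None else []
```

Seeking stays a constant-time index calculation, while memory grows with the number of populated pages rather than the length of the timeline.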
I'm not clear what the above has to do with communicating between threads
using ringbuffers; I'm just asking how the audio callback stores events
for a given time and then finds them quickly when that time arrives. But
maybe I'm totally missing something here.
Would love to hear in pseudo code how others have tackled storing and
finding events in time.
thanks!
Iain