[linux-audio-dev] XAP: a polemic

Tim Goetze tim at quitte.de
Sun Dec 15 10:50:01 UTC 2002


David Olofson wrote:

>If this does not demonstrate why I think NOTEPITCH is useful, I 
>frankly have no idea how to explain it, short of implementing both 
>alternatives in code.

i agree that the ability to discern different scales is handy
indeed. but the only clean way to implement it is by going
back to integer note numbers.

this is definitely something plugins shouldn't have to care
about, i think. a reasonable approach is to leave the generation
of correct pitch values to the sequencer -- if it does in fact
support multiple tunings, it is bound to know how they map to
pitch. no need to duplicate this knowledge in the api.

or just go 12. / octave and meet me in politically incorrect
westerners' hell -- i'll buy you a beer there if you manage
to keep your future posts a little shorter.

>Musical time *stops* when you stop the sequencer, which means that 

>So, for example, you can't change controls on your mixer, unless you 
>have the sequencer running. How logical is that in a virtual studio?  

this is a non-issue. if time stops but processing does not,
all plugins keep processing events <= 'now', and the events
they emit are stamped 'now'. that's how i do it anyway.

you can keep counting time in the stopped state, but doing so
is void of any meaning within transport time, whether you're
counting samples or beats.
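a minimal sketch of that rule, with an invented singly-linked
event queue (event_t and run_cycle are not real api names):
consume everything stamped <= 'now', and anything emitted gets
stamped 'now':

```c
/* sketch: a plugin's cycle behaves the same whether transport
 * rolls or not.  event_t and run_cycle are invented names. */
typedef struct event { unsigned time; int data; struct event *next; } event_t;

/* process the input queue up to and including 'now'; consumed
 * events are re-stamped 'now' and pushed onto the output queue
 * (order is not preserved in this simplistic sketch). */
static void run_cycle(event_t **in, event_t **out, unsigned now)
{
    while (*in && (*in)->time <= now) {
        event_t *e = *in;
        *in = e->next;
        e->time = now;        /* re-stamp: output happens 'now' */
        e->next = *out;       /* push onto output queue */
        *out = e;
    }
}
```

when time stops, 'now' stays put and the loop degenerates to
passing along only events stamped at the standstill point.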

[bbt system]

>IMHO, it should not be hardcoded into sequencers, and 
>definitely not into APIs.

in fact i find myself happy without bbt mappings in the 
sequencer core, yes. others may have other needs.

>You *may* know about the future along the sequencer's timeline, but 
>you do *not* know what the relation between audio time and musical 
>time will be after the end of the current buffer.

you *will* know if you have a central tempo map -- which you
have when you have a sequencer around, or when you confine
yourself to one tempo, one beat, i.e. the linear mapping case.
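for illustration, a piecewise-constant tempo map is already
enough to answer 'what is the relation after the end of the
buffer'. all names here (tempo_seg_t, the field names) are
invented; segments are assumed sorted by position:

```c
/* sketch: with one central tempo map, the frame<->beat relation
 * is known arbitrarily far ahead.  invented names throughout. */
typedef struct {
    double frame;   /* segment start, audio frames */
    double beat;    /* segment start, beats */
    double fpb;     /* frames per beat in this segment (tempo) */
} tempo_seg_t;

/* map a frame position to beats, given n segments sorted by frame */
static double frame_to_beat(const tempo_seg_t *m, int n, double f)
{
    int i = n - 1;
    while (i > 0 && m[i].frame > f) i--;
    return m[i].beat + (f - m[i].frame) / m[i].fpb;
}

/* the inverse mapping, given the same segments sorted by beat */
static double beat_to_frame(const tempo_seg_t *m, int n, double b)
{
    int i = n - 1;
    while (i > 0 && m[i].beat > b) i--;
    return m[i].frame + (b - m[i].beat) * m[i].fpb;
}
```

both directions are total and each inverts the other, which is
the point: one central map, valid arbitrarily far ahead.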

>After you return from process(), event scheduled (be it with sample 
>count or musical time as a timestamp), someone might commit an edit 
>to the timeline, there could be a transport stop, or there could be a 
>transport jump. In either of those cases, you're in trouble.

you're not. if you don't check all queues on a transport
state change, you don't yet understand enough about the
workings of a sequencer.

you may also want to synchronize changes to the tempo map and
the loop points to be executed at cycle boundaries, which is 
how i am making these less invasive, but that's another story.
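a sketch of that queue check on a transport jump, again with
invented names: events stamped at or beyond the jump point refer
to a future that will no longer happen, so they are unlinked
(a real implementation would free or re-schedule them):

```c
/* sketch: on a transport jump, walk every pending queue and
 * drop events scheduled for the abandoned future.  ev_t and
 * flush_from are invented names. */
typedef struct ev { unsigned frame; struct ev *next; } ev_t;

/* drop all events at or after the jump point; returns new head */
static ev_t *flush_from(ev_t *head, unsigned jump_frame)
{
    ev_t **p = &head;
    while (*p) {
        if ((*p)->frame >= jump_frame)
            *p = (*p)->next;   /* unlink (owner frees in real code) */
        else
            p = &(*p)->next;
    }
    return head;
}
```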

>> >	* Is there a need for supporting multiple timelines?
>>
>> this is a political decision,
>
>I disagree. It's also a technical decision. Many synths and effects 
>will sync with the tempo, and/or lock to the timeline. If you can 
>have only one timeline, you'll have trouble controlling these plugins 
>properly, since they treat the timeline pretty much like a "rhythm" 
>that's hardcoded into the timeline.

to put this straight: there is no difference between musical
time and 'real' time. the funny thing is that if you're rooted
in one view, the other appears non-linear. but the mapping
between the two is an isomorphism no matter how you look at it.

the trouble only starts when you have multiple mappings
between musical and transport time, not when you have but one.

back to the question, it is a decision that has deep technical
implications, yes. but i insist it is largely political because 
few will ever need multiple concurrent 'timelines', and many 
will have to pay a price to enable them. 

i can only say i don't need them in musical practice, and they
are uncommon enough to let those needing them do the maths
themselves. one of the few things common to all musical cultures
seems to be that there is one predominant rhythm, if there is
rhythm at all.

>It doesn't seem too complicated if you think of it as separate 
>sequencers, each with a timeline of its own... They're just sending 
>events to various units anyway, so what's the difference if they send 
>events describing different tempo maps as well?

the point in having a sequencer is to have it become the
central authority over tempo and time. the idea of sending
tempo change events is, i'm afraid, another sign of a lack
of understanding of sequencers.

and to make the point clear: you don't send tick events
either, that's for synchronizing external equipment. you
simply don't need it with a central time/tick/frame mapping.

>Maybe it will be in most cases, but I can't see any real reasons why 
>you couldn't implement it as a reasonably normal plugin. 

there's absolutely no reason for the api to cover sequencers
in plugin guise. the sequencer has a one-to-one relationship
with the host. to be able to sequence every event in the
network, you must assume the sequencer's access to the network
to be about equivalent to the host's.

>Yes, but you still have to deal with transport events. No big deal, 
>though; you just have to tell everyone that cares about them, so they 
>can adjust their internal "song position counters" at the right time.

transport control is not an event, because it invariably
involves a discontinuity in time; it thus transcends the very
idea of an event in time.

and plugins don't have an internal 'song position counter'.
they rely on the host/sequencer to keep track of the passage
of time, that's the whole point.
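a sketch of that division of responsibility, with an invented
transport_t: the host fills in the state once per cycle, and a
plugin only ever reads it:

```c
/* sketch: plugins read transport state the host maintains; they
 * never count song position themselves.  transport_t and
 * end_beat are invented names. */
typedef struct {
    int      rolling;  /* is musical time advancing? */
    unsigned frame;    /* audio frame at cycle start */
    double   beat;     /* musical position at cycle start */
} transport_t;

/* where will musical time be at the end of an nframes cycle?
 * fpb = frames per beat (current tempo).  if the transport is
 * stopped, musical time simply does not advance. */
static double end_beat(const transport_t *tr, unsigned nframes, double fpb)
{
    return tr->rolling ? tr->beat + nframes / fpb : tr->beat;
}
```

a transport jump is then just the host rewriting frame/beat
before the next cycle -- no per-plugin counters to fix up.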

tim



