David Olofson wrote:
Well, considering that we seem to have virtually *no* input from
people with solid experience with software sequencers or traditional
music theory based processing, I suggest we either decide to build a
prototype based on what we *know*, or put XAP on hold until we manage
to get input from people with real experience in more fields.
it's (mostly) all there for you to read. there's a pure sequencer
engine called 'tse3' out there that is somebody's third go at
writing the sequencer alone. no matter what its merits are, reading
its source is bound to give you an idea of how things come together
in a sequencer (hosted at sf.net iirc). and there's always muse
which, iirc, also covers audio.
* Is an explicitly scale-related pitch control type needed?
1.0/octave is the politically correct value, i guess.
12.0/octave is what i am happy with.
since transformation between the two is a simple multiplication,
i don't care much which one gets voted in.
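to make the point concrete, a minimal sketch (the helper names are
made up, not part of any proposal):

    /* converting between 1.0/octave and 12.0/octave pitch values;
     * either direction is a single multiplication or division. */
    static inline double octaves_to_semitones(double pitch_oct)
    {
        return pitch_oct * 12.0;
    }

    static inline double semitones_to_octaves(double pitch_semi)
    {
        return pitch_semi / 12.0;
    }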
* Is there a good reason to make event system timestamps relate
  to musical time rather than audio time?
yes. musical time is, literally, the way a musician perceives
time. he will say something like "move the snare to the sixteenth
before beat three there" but not "move it to sample 3440004."
the system should do its best to make things transparent to the
musician who uses (and programs) it; that is why i am convinced
the native time unit should relate to musical time.
i do not think it should be explicitly 'bar.beat.tick', but rather
total ticks that get translated when needed. this judgement is
based on intuition rather than fact, i fear. for one thing, flat
tick counts make all arithmetic and comparisons on timestamps a
good deal cheaper, and the format is simpler to describe. however,
in many algorithms it makes the % operator necessary where a
direct comparison on the beat field would do. otoh, the % operator
can be used effectively to cover multi-bar patterns, which is
where the bbt scheme becomes less handy.
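a rough illustration of that trade-off (the resolution, meter and
four-bar pattern length are arbitrary examples, not proposed values):

    /* timestamps as flat tick counts */
    #define TICKS_PER_BEAT 96
    #define BEATS_PER_BAR  4
    #define TICKS_PER_BAR  (TICKS_PER_BEAT * BEATS_PER_BAR)

    /* with bar.beat.tick an "exactly on beat 3?" test is a direct
     * comparison on the beat field; with flat ticks it needs %: */
    static int on_beat_three(unsigned long tick)
    {
        return tick % TICKS_PER_BAR == 2 * TICKS_PER_BEAT;
    }

    /* but the same % handles a pattern spanning four bars, where
     * bar.beat.tick offers no direct help: */
    static unsigned long pos_in_4bar_pattern(unsigned long tick)
    {
        return tick % (4 * TICKS_PER_BAR);
    }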
* Should plugins be able to ask the sequencer about *any* event,
  for the full length of the timeline?
you're perfectly right in saying that all events destined for
consumption during one cycle must be present when the plugin
starts the cycle. i do not think it is sane to go beyond this
timespan here.
however, time conversion functions must exist that give valid
results for points in the past and future with respect to the
current transport time, so that future events can be scheduled
correctly.
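something along these lines, perhaps (the names and signatures
here are pure invention, just to illustrate the requirement):

    /* hypothetical host-provided conversions; the point is only
     * that they must accept positions outside the current block
     * and still answer correctly from the tempo map. */
    typedef struct XAP_host XAP_host;

    /* musical position (ticks) -> absolute audio time (frames) */
    double xap_ticks_to_frames(XAP_host *host, double ticks);

    /* absolute audio time (frames) -> musical position (ticks) */
    double xap_frames_to_ticks(XAP_host *host, double frames);

    /* e.g. a delay plugin wanting to emit an event one bar ahead:
     *   when = xap_ticks_to_frames(host, now_ticks + ticks_per_bar);
     * even though 'when' lies well past the current cycle. */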
* Is there a need for supporting multiple timelines?
this is a political decision, and it's actually a decision you
have to make twice: one, multiple tempi at the same point; and
two, multiple ways to count beats (7/8 time vs 3/4 time vs 4/4
time etc.) running concurrently.
being politically quite incorrect, i am happy supporting only
one tempo and one time signature at any given point. imagine how
complicated things get when you answer 'yes' to both questions
above, and add to this that i can describe the music i want to
make without either (even standard polyrhythmic patterns, because
they usually meet periodically, so a single timeline can still
hold them).
multiple tempi are really uncommon, and tend to irritate
listeners easily.
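as a back-of-the-envelope check on "they meet periodically"
(arbitrary example meters, counted in eighth notes):

    /* where does a 7/8 pattern realign with a 4/4 pattern? */
    static unsigned gcd(unsigned a, unsigned b)
    {
        while (b) { unsigned t = a % b; a = b; b = t; }
        return a;
    }

    static unsigned realign_period(unsigned eighths_a, unsigned eighths_b)
    {
        return eighths_a / gcd(eighths_a, eighths_b) * eighths_b;  /* lcm */
    }

    /* realign_period(7, 8) == 56 eighth notes: eight bars of 7/8
     * equal seven bars of 4/4, so one timeline covers both. */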
* Is it at all possible, or reasonable, to support sequencers,
  audio editors and real time synths with one, single plugin API?
the sequencer definitely needs a different kind of connection to
the host. in fact, i think it should be assumed to be part of the
host, or quite simply to *be* the host.
for simple hosts, default time conversion facilities are really
simple to implement: one tempo and one time signature fixed at
transport time zero does it. conversion between linear and musical
time is then a simple multiplication.
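with one fixed tempo the whole facility boils down to something
like this (120 bpm, 96 ticks per beat and 48 kHz are arbitrary
example values):

    #define TEMPO_BPM      120.0
    #define TICKS_PER_BEAT 96.0
    #define SAMPLE_RATE    48000.0

    static double frames_to_ticks(double frames)
    {
        double seconds = frames / SAMPLE_RATE;
        return seconds * (TEMPO_BPM / 60.0) * TICKS_PER_BEAT;
    }

    static double ticks_to_frames(double ticks)
    {
        double beats = ticks / TICKS_PER_BEAT;
        return beats * (60.0 / TEMPO_BPM) * SAMPLE_RATE;
    }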
audio editors, i don't know. if you call it 'offline processing'
instead, i'd ask where the basic difference to realtime is.
real time synths -- wait, that's the point, isn't it? ;)
tim