On Sat, Feb 16, 2013 at 02:13:33PM -0500, Paul Davis wrote:
On Sat, Feb 16, 2013 at 12:21 PM, Pedro Lopez-Cabanillas <pedro.lopez.cabanillas(a)gmail.com> wrote:
There are other use cases for MIDI that don't involve soft synths, or even
don't involve music at all. What I find laughable is the arrogance of
pretending that everybody fits a single use case.
it certainly would be laughable if someone were to do that. but it appears
that this is yet another de-personalized attempt to point the finger
specifically at *me*. if so, then i find that a bit odd, given that i have
repeatedly explained right here on this list, even within the last few
months, the pros and cons of ALSA vs. JACK MIDI for different purposes.
I've been reading this thread with constantly rising levels of
amazement and disbelief, not only because of the rather pointless
effort of trying to blame Paul Davis for a problem that simply does
not exist, but also because the core of the matter regarding Jack vs.
Alsa MIDI hasn't been touched on at all.
What the Alsa sequencer tries to do is to deliver MIDI events 'at the
right time'. This only makes sense if the destination is a hardware
MIDI device, one for which the timing of events is expressed as *physical*
time. If the destination is some audio processing code, and even more
so if both source and destination are audio processing apps, this is
exactly what you do *not* want or need.
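
To illustrate what 'delivering at the right time' means (a minimal
sketch, not part of the original mail, assuming a client handle 'seq',
a port 'my_port' and a queue 'queue_id' created elsewhere): the client
queues an event with an absolute wall-clock timestamp, and the
sequencer dispatches it when that time arrives.

/* Sketch only: schedule a note-on so the Alsa sequencer delivers it
   at an absolute real (wall-clock) time on the given queue. */
#include <alsa/asoundlib.h>

void send_note_at (snd_seq_t *seq, int my_port, int queue_id,
                   unsigned int sec, unsigned int nsec)
{
    snd_seq_event_t     ev;
    snd_seq_real_time_t rtime = { sec, nsec };

    snd_seq_ev_clear (&ev);
    snd_seq_ev_set_source (&ev, my_port);
    snd_seq_ev_set_subs (&ev);                           /* deliver to subscribers */
    snd_seq_ev_set_noteon (&ev, 0, 60, 100);             /* channel 0, middle C    */
    snd_seq_ev_schedule_real (&ev, queue_id, 0, &rtime); /* absolute, not relative */

    snd_seq_event_output (seq, &ev);
    snd_seq_drain_output (seq);
}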
No matter whether the audio apps use Jack, Alsa or any other
interface, audio data will be processed in blocks of samples, and the
exact time at which this processing takes place is unrelated to the
time of any events, MIDI or other, that are supposed to determine what
is being done by the audio code.
Take the simplest case of a Jack app: audio is processed per period.
Processing a period will typically take much less time than the
period itself, and it can happen at any time between the start of the
current period and the start of the next one. It just doesn't matter
when, and you can't control it. That means that within each period,
time can be represented only by data - timestamps in some form - not
by physical time. Any interface, e.g. the Alsa sequencer, that tries
to deliver events 'at the right time' just complicates things no end.
In fact it's quite unusable in this context, unless you ignore the
'exact' timing and accept up to a period of jitter. What Jack MIDI
does is avoid any reference to physical time: 'time' is represented
and transferred in the form it is available at the source and required
at the destination - as *data* - with conversion to 'real' time being
done only at the physical interfaces.
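
To make that concrete, here is a minimal sketch (again mine, not from
the original mail) of a Jack process callback reading MIDI input,
assuming a port 'midi_in' registered elsewhere with
JACK_DEFAULT_MIDI_TYPE. Each event's time is just a frame offset
within the current period - data - regardless of when the callback
actually runs.

/* Sketch only: a Jack process callback reading MIDI events whose
   timing is carried as data (a frame offset into the current period). */
#include <jack/jack.h>
#include <jack/midiport.h>

extern jack_port_t *midi_in;    /* assumed registered elsewhere */

int process (jack_nframes_t nframes, void *arg)
{
    void     *buf = jack_port_get_buffer (midi_in, nframes);
    uint32_t  i, n = jack_midi_get_event_count (buf);

    for (i = 0; i < n; i++)
    {
        jack_midi_event_t ev;
        if (jack_midi_event_get (&ev, buf, i)) break;
        /* ev.time is in [0, nframes): the sample within this period
           at which the event should take effect.  The audio code can
           split its work at ev.time, no matter when this callback
           actually runs. */
    }
    return 0;
}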
There is a similar problem with OSC: to be usable by audio processing
code, timed events should not be released at some exact time (as liblo
does, IIRC), but delivered as data timestamped in relation to the Jack
period.
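
A purely hypothetical sketch of that idea (the names and the mapping
are mine, not liblo's): keep the timestamp as data and convert it to a
frame offset within the current Jack cycle. The separate step of
translating an OSC (NTP) timetag onto Jack's microsecond clock is left
out; 'event_usecs' is assumed to already be on that clock.

/* Sketch only: map an event time, already expressed on Jack's
   microsecond clock, to a frame offset within the current cycle.
   To be called from within the process callback. */
#include <jack/jack.h>

jack_nframes_t event_frame_offset (jack_client_t *client,
                                   jack_time_t    event_usecs,
                                   jack_nframes_t nframes)
{
    jack_nframes_t cycle_start = jack_last_frame_time (client);
    jack_nframes_t event_frame = jack_time_to_frames (client, event_usecs);

    if (event_frame <= cycle_start) return 0;      /* late: handle now */
    jack_nframes_t off = event_frame - cycle_start;
    return (off < nframes) ? off : nframes - 1;    /* events beyond this period
                                                      would really be deferred to
                                                      a future cycle; clamped here
                                                      for simplicity */
}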
Ciao,
--
FA
A world of exhaustive, reliable metadata would be a utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)