1) Is the sync properly implemented in a sequencer
application, or does ALSA
provide a useful framework for some kind of general solution? I believe
that the ALSA sequencer does not do this. it could probably be coerced
into doing so, but it wouldn't work correctly on kernels before the
2.5.6X series or so: the "scheduling" requirements for delivering MTC
are impossible to satisfy on earlier kernels without patches (and not
the low-latency patch, but others).
both Ardour and Muse have MTC implementations built into the
applications (which seem to have rotted at this point; they don't
seem to work for me).
Why was this, and what are the alternatives?
see above.
2) The 2000 thread discussed how bad SMPTE is for audio
sync. Unfortunately,
SMPTE/MTC seems to be the standard that is implemented in hardware devices
that we can actually buy. There is probably no point in beating that dead
horse again. It is what we have.
SMPTE/MTC has a resolution of (typically) 1/30 second. it therefore
cannot be used for sample-accurate positioning. it will still ensure
that everything ends up in the same place, but that place may not be
exactly where you want it to be.
3) Does Jack have a role to play in syncing
applications and external hardware
together?
only applications.
see jack/transport.h, but note that it is still evolving. there has
been lots of discussion recently on jackit-devel about the
representation of musical time.
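for orientation, a minimal sketch of a client reading the transport
state through that header. this is my own illustration, not code from
jack itself; it needs a running JACK server and linking with -ljack,
and since the interface is still evolving the details may change:

```c
/* sketch only: query the JACK transport from a client.
   the API in jack/transport.h is still evolving, so check the
   header for the current structures and calls. */
#include <stdio.h>
#include <jack/jack.h>
#include <jack/transport.h>

int main(void)
{
    jack_client_t *client = jack_client_new("transport-peek");
    if (client == NULL) {
        fprintf(stderr, "cannot connect to a JACK server\n");
        return 1;
    }

    jack_position_t pos;
    jack_transport_state_t state = jack_transport_query(client, &pos);

    /* pos.frame is the transport position in audio frames,
       which is what makes app-to-app sync sample-accurate */
    printf("transport %s at frame %lu\n",
           state == JackTransportRolling ? "rolling" : "stopped",
           (unsigned long) pos.frame);

    jack_client_close(client);
    return 0;
}
```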
4) Does SMPTE/MTC have a role in software to software
sync, or just syncing to
external hardware?
i hope just the latter.
note: ardour will probably shift to using SMPTE for external h/w sync
soon. SMPTE is a continuous audio signal, and as such it can be
handled precisely without any requirement for precise kernel
scheduling. MTC will be available in slave mode on all kernels, and in
master mode only on kernels with the new clock support present in the
2.5.6X series.
--p