Peter Brinkmann wrote:
 I have a sequencer application that has several MIDI output ports,
 each connected to some device that accepts MIDI input. Those
 devices may have vastly different latencies ... but of course I
 don't want to hear a time lag between those devices.
 I'm currently scheduling all my events through one queue (is that the
 recommended method? I've been wondering whether it would make more sense
 to have, say, one queue per output port, but I don't see how this would
 help),
Using multiple queues doesn't make sense except when you want to use
different timer interrupts for them.
 and the only solution I have been able to think of is to explicitly
 schedule events for faster devices at a later time. This is clumsy, and
 it's exacerbated by the fact that I'd like to schedule events in terms of
 ticks rather than milliseconds. Since latencies are usually measured in
 milliseconds, that means I have to convert them to ticks, considering
 the current tempo of the queue. There's gotta be a better way.
 Ideally, there are two things I'd like to do:
     1. Assign a delay dt to each output port, so that an event scheduled
     at time t0 will be sent at time t0+dt. Like this, I could compute the
     maximum latency of all my devices, and the output port attached to a
     device would get a delay of (max latency - latency of device), so
     that everything would be in sync. 
This would just move the ms-to-tick conversion into the ALSA
sequencer.
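For what it's worth, the compensation arithmetic from point 1 can be sketched
like this (the function names and the example latencies are mine, purely for
illustration; this is not an ALSA API):

```python
# Sketch: per-port delay compensation, assuming each device's latency is
# already known in milliseconds. Tempo is in microseconds per quarter note
# (ALSA's convention), ppq is ticks per quarter note.

def ms_to_ticks(ms, tempo_us_per_quarter, ppq):
    """Convert a delay in milliseconds to sequencer ticks at the given tempo."""
    # One tick lasts tempo_us_per_quarter / ppq microseconds.
    return round(ms * 1000 * ppq / tempo_us_per_quarter)

def port_offsets_ticks(latencies_ms, tempo_us_per_quarter, ppq):
    """Delay each port by (max latency - its own latency), expressed in ticks."""
    max_lat = max(latencies_ms.values())
    return {port: ms_to_ticks(max_lat - lat, tempo_us_per_quarter, ppq)
            for port, lat in latencies_ms.items()}

# Example at 120 BPM (500000 us per quarter) with 96 ticks per quarter:
offsets = port_offsets_ticks({"hw_synth": 2.0, "softsynth": 54.0},
                             tempo_us_per_quarter=500000, ppq=96)
# The slowest device gets offset 0; faster ones are delayed to match it.
```

Note the conversion depends on the current tempo, so the offsets would have
to be recomputed whenever the queue tempo changes, which is exactly the
clumsiness Peter is complaining about.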
     2. Automatically determine the latencies of the devices I'm talking
     to. In theory, this should be possible. For instance, if timidity is
     connected to jack, it could get jack's current latency, add its own
     latency, and report the result. Is this science fiction?
This doesn't help with external hardware synthesizers; you'd have to
measure how soon after a note-on command its analog output shows a
signal.
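A loopback measurement along those lines, recording the synth's analog output
while sending a note-on and finding the first sample above the noise floor,
might look like this (a pure sketch; the function name, threshold, and test
data are illustrative):

```python
# Sketch: estimate a hardware synth's latency from a recorded loopback.
# `samples` is the recorded audio, `noteon_sample` the sample index at
# which the note-on was sent, `threshold` a level above the noise floor.

def measured_latency_ms(samples, sample_rate, noteon_sample, threshold=0.05):
    """Return ms from note-on to first audible sample, or None if silent."""
    for i in range(noteon_sample, len(samples)):
        if abs(samples[i]) > threshold:
            return (i - noteon_sample) * 1000.0 / sample_rate
    return None
```

In practice the result includes the sound card's own capture latency, which
would have to be subtracted, and a single measurement can be fooled by noise,
so averaging several runs would be wise.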
Regards,
Clemens