On Jul 24, 2006, at 7:43 AM, Dave Robillard
<drobilla(a)connect.carleton.ca> wrote:

> Anyway, as soon as you go sysex you lose the semantics and you have
> the same problem anyway - retransmission is the only possible solution
> if you know nothing about the data (it becomes application specific).
RTP MIDI has several ways to deal with this. For senders that know the
semantics of what they are sending (like, say, Novation would if they
were adding Wi-Fi to their keyboard line), the recovery journal syntax
for SysEx lets the sender specify recovery data in a way that suits
those semantics, and the encoding lets a receiver figure out how to use
that data to recover.
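
To make that concrete, here is a minimal sketch of the sender side,
assuming a hypothetical device whose only SysEx is "set parameter N to
value V". None of the names or structures below come from RFC 4695 (the
real journal uses a compact chapter encoding); the point is just that
knowing the semantics keeps the recovery state tiny, since only the
newest value per parameter has to survive a loss.

#include <stdint.h>
#include <stddef.h>

#define NUM_PARAMS 128  /* hypothetical parameter space for this device */

/* Semantic recovery state: one slot per parameter, newest value wins. */
typedef struct {
    uint8_t value[NUM_PARAMS];   /* last value sent for each parameter   */
    uint8_t active[NUM_PARAMS];  /* 1 if the parameter has ever been sent */
} semantic_journal;

/* Called just before each "set parameter" SysEx goes out in a packet. */
static void journal_note_send(semantic_journal *j, uint8_t param,
                              uint8_t value)
{
    j->value[param]  = value;
    j->active[param] = 1;
}

/* Copy the journal into an outgoing packet, so a receiver that detects
 * a loss can repair state without any retransmission.  Here it is just
 * (param, value) pairs, not the real chapter encoding. */
static size_t journal_emit(const semantic_journal *j, uint8_t *out,
                           size_t max)
{
    size_t n = 0;
    for (int p = 0; p < NUM_PARAMS && n + 2 <= max; p++) {
        if (!j->active[p]) continue;
        out[n++] = (uint8_t)p;
        out[n++] = j->value[p];
    }
    return n;
}
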
For senders that don't know the semantics of what they are sending
(like a box with MIDI DIN jacks on one end and a WiFi antenna on the
other), there are several options. One is to use the recovery journal
encoding for SysEx that is a simple list of all commands for a type of
SysEx, and rely on more frequent RTCP feedback from receiver to sender
to keep the journal trimmed to a reasonable length.
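
A sketch of that option, with made-up names and no attempt at the real
wire syntax: keep a verbatim list of every SysEx sent since the last one
the receiver has reported seeing, and let RTCP receiver reports trim the
list.

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* One journal entry: a verbatim copy of a SysEx command plus the RTP
 * sequence number of the packet that carried it. */
typedef struct journal_entry {
    struct journal_entry *next;
    uint16_t seq;       /* RTP sequence number of the carrying packet */
    size_t   len;
    uint8_t  data[];    /* the SysEx bytes, copied verbatim */
} journal_entry;

typedef struct {
    journal_entry *head;   /* oldest unacknowledged command */
    journal_entry *tail;
} sysex_list_journal;

/* Append a copy of a SysEx command as it is sent in packet 'seq'. */
static void journal_append(sysex_list_journal *j, const uint8_t *sysex,
                           size_t len, uint16_t seq)
{
    journal_entry *e = malloc(sizeof *e + len);
    if (!e) return;                    /* drop journaling on allocation failure */
    e->next = NULL;
    e->seq  = seq;
    e->len  = len;
    memcpy(e->data, sysex, len);
    if (j->tail) j->tail->next = e; else j->head = e;
    j->tail = e;
}

/* On an RTCP receiver report, drop every entry the receiver has seen.
 * The signed difference handles 16-bit sequence number wraparound. */
static void journal_trim(sysex_list_journal *j, uint16_t highest_seq)
{
    while (j->head && (int16_t)(highest_seq - j->head->seq) >= 0) {
        journal_entry *dead = j->head;
        j->head = dead->next;
        if (!j->head) j->tail = NULL;
        free(dead);
    }
}

Without frequent RTCP feedback the list grows without bound, which is
why this option leans on shorter feedback intervals.
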
Alternatively, it's possible to split a MIDI cable into two RTP MIDI
streams -- one TCP and one UDP -- and gate the SysEx onto the TCP
stream.
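
Roughly like this, assuming 'tcp_fd' and 'udp_fd' are already-connected
sockets feeding the two streams (RTP MIDI packetization omitted):

#include <stdint.h>
#include <stddef.h>
#include <unistd.h>

/* Route one complete MIDI command: SysEx (status 0xF0) goes onto the
 * reliable TCP stream, everything else onto the UDP stream, where the
 * recovery journal handles any loss. */
static void gate_command(int tcp_fd, int udp_fd,
                         const uint8_t *cmd, size_t len)
{
    int fd = (cmd[0] == 0xF0) ? tcp_fd : udp_fd;
    (void)write(fd, cmd, len);   /* real code would packetize as RTP MIDI */
}
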
> (especially custom sysex evil that throws interoperability completely
> out the window).
Most industry folks who need to do unusual things with MIDI don't start
with SysEx. They start by drawing analogies between what they need to
do and the standard MIDI command set, and repurposing those commands.
This is partially done to make sure DAWs can edit the data, and
partially done to get the efficiency of running status over the wire.
SysEx is used for secondary features. You can see this design
philosophy in the Logic Control specification in Appendix B of:
http://manuals.info.apple.com/en/Logic7_DedicatedCntrlSurfaceInfo.pdf
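
As a rough illustration of the style -- a 14-bit fader position
re-purposed onto Pitch Bend, a plain three-byte channel message -- the
mapping here is generic, not quoted from the Logic Control spec:

#include <stdint.h>

/* Re-purpose Pitch Bend to carry a 14-bit fader position on a given
 * channel: a standard channel message that DAWs can record and edit,
 * and that benefits from running status on the wire. */
static void fader_to_pitch_bend(uint8_t channel, uint16_t pos14,
                                uint8_t out[3])
{
    out[0] = (uint8_t)(0xE0 | (channel & 0x0F));  /* Pitch Bend status */
    out[1] = (uint8_t)(pos14 & 0x7F);             /* LSB, 7 bits       */
    out[2] = (uint8_t)((pos14 >> 7) & 0x7F);      /* MSB, 7 bits       */
}
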
If I were rewriting an OSC application to use MIDI, with an eye
towards good RTP MIDI loss behavior, I'd take this re-purposing
approach ... I'd be curious to see how Jazzmutant did it, since in
their latest release of the Lemur, MIDI is now a full-fledged transport
and not sent via OSC, if I read this web page correctly:
http://www.jazzmutant.com/lemur_lastupdate.php
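
For instance (a sketch with a made-up controller number): a normalized
OSC-style float could ride on a standard 14-bit controller pair, MSB on
controller N and LSB on controller N+32, so ordinary MIDI tools and the
recovery journal both already know what to do with it.

#include <stdint.h>

/* Map a normalized float parameter onto a 14-bit MIDI controller pair:
 * controllers 0-31 carry the MSB, controllers 32-63 the matching LSB.
 * Two three-byte Control Change messages come out. */
static void float_to_cc14(uint8_t channel, uint8_t cc_msb, float value,
                          uint8_t out[6])
{
    if (value < 0.0f) value = 0.0f;
    if (value > 1.0f) value = 1.0f;
    uint16_t v14 = (uint16_t)(value * 16383.0f + 0.5f);

    out[0] = (uint8_t)(0xB0 | (channel & 0x0F));  /* Control Change    */
    out[1] = cc_msb;                              /* e.g. CC 16 (MSB)  */
    out[2] = (uint8_t)((v14 >> 7) & 0x7F);
    out[3] = (uint8_t)(0xB0 | (channel & 0x0F));
    out[4] = (uint8_t)(cc_msb + 32);              /* paired LSB, CC 48 */
    out[5] = (uint8_t)(v14 & 0x7F);
}
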
> Human readability and interoperability IS often important (eg using
> supercollider or pd or whatever to control things).
I use Structured Audio in my own work, and Eric Scheirer's language
support design for MIDI has many good aspects. See:
http://www.cs.berkeley.edu/~lazzaro/sa/book/control/midi/index.html

In 2006, if I were designing a replacement language, I'd do the MIDI
interface language design differently, given my experience using and
implementing SAOL. But I don't consider SAOL's MIDI support hard to
program in its present state, apart from some details of extend() and
turnoff for handling NoteOff release activities.
---
John Lazzaro
http://www.cs.berkeley.edu/~lazzaro
lazzaro [at] cs [dot] berkeley [dot] edu
---