On Sun, 20 Jan 2019, Will J Godfrey wrote:
Following on, I think our biggest problem is going to be actually getting the
data into a computer. I can't imagine a practical way either ALSA or JACK can
be modified to accept it.
The video does not add much to what I was already aware of. Not too long
ago, the MMA were much more open than they are now and I could browse
(without a login) where MIDI 2 was at and the format of commands. I have
forgotten the specifics :) and so the video was a good refresher.
Jackd is a data pipe, and so is ALSA for that matter. Audio is 32-bit
(float); MIDI 2 is 32-bit as well. ALSA (so far as I know) doesn't "mix"
or combine MIDI. Jackd does, and so it would still need to know event
boundaries. I would think the way forward for both ALSA and jackd would be
to use MIDI 2 internally and convert to MIDI 1.0 at the port if needed.
That is, stream MIDI as 32-bit words and let the applications deal with
negotiations with the other end. 32-bit MIDI works well with AES67 or AVB
too, it is just a different data type. So even though AES3 streams are not
included in the AES67 standard, many of the AES67 boxes will deal with
them just fine.
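To make the "stream MIDI as 32-bit words" point concrete, here is a rough
Python sketch of packing a classic 3-byte MIDI 1.0 channel-voice message
into a single 32-bit word, along the lines of what the public MIDI 2
presentations describe. The message-type nibble and field order are my
assumptions from those presentations, not the final spec (which isn't out
yet):

```python
import struct

# Assumed layout, high byte to low: message-type nibble, group nibble,
# then status, data1, data2 -- my reading of the draft packet format.
MT_MIDI1_CHANNEL_VOICE = 0x2

def midi1_to_word(status: int, data1: int, data2: int, group: int = 0) -> int:
    """Pack a 3-byte MIDI 1.0 channel-voice message into one 32-bit word."""
    return ((MT_MIDI1_CHANNEL_VOICE << 28) | ((group & 0xF) << 24)
            | (status << 16) | (data1 << 8) | data2)

# Note-on, channel 0, middle C, velocity 100:
word = midi1_to_word(0x90, 60, 100)
print(hex(word))  # 0x20903c64

# On the wire or down a pipe it is just another fixed-size item,
# the same width as a float audio sample:
frame = struct.pack(">I", word)  # 4 bytes, big-endian
```

The point being: once everything is a fixed-width word, a pipe like jackd
doesn't need in-band length bytes to find event boundaries.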
MIDI 2 is bidirectional; jackd and ALSA are not, but they don't disallow
it either. In both cases there would need to be two channels to handle
that... but the reality for a lot of MIDI 1 applications has been
bidirectional since day one. Even with my slightly pre-MIDI-1 MIDI gear,
there is the idea of the two devices talking with each other, just over
two cords (for speed more than anything).
OSC would only work if the data source was sending it; otherwise you'd
still need a translation layer within the machine (in which case you might
as well work with the protocol directly). The big synth names are not
likely to put any effort into OSC support, as they have already thrown
their hats in for MIDI 2.
I was actually surprised that the MMA managed to get both Apple and
Microsoft on board. OSC, as the video notes, was not sold well to the
manufacturers... I would say the real reason is both OSC's strength and
its weakness: you can do anything with OSC, it is wide open. OSC did not,
from the start, come with standard ways of doing things (note on/off,
controllers, etc.). Every single OSC use is custom; there are no shared
standards, each application has its own. Quite the opposite of MIDI. I
know of only two DAWs that share the same OSC map, and those are Ardour
and Mixbus ;)
So there are no physical OSC control surfaces available, and even the
glass counterparts like TouchOSC are limited and still require someone to
create a layout for everything they might control, even though each
application might have the very same controls.
In many ways, if MIDI 2 offers good standard ways of dealing with mixer
control, lighting, transport control and other automation, OSC may just go
away because MIDI 2 can offer more. It would be easy even now for me to
create a Mackie Control protocol to OSC converter... or one for almost any
MIDI-based controller out there. MIDI 2 will only make this easier. MIDI 2
is also more compact by virtue of being streamed. While OSC may be
self-documenting, for most controllers each message/event/command requires
a whole network packet. I tried using "bundles" but found that none of the
control surfaces I was trying to support recognised them. The feedback
stream from a DAW is quite heavy: a bank change that the control surface
sends as one message can require the DAW to send back as many as 1000
messages, each one in its own packet... UDP chokes and many messages never
make it there unless a small delay is added between each message.
There are ways around this. Bundles are one, if the surface understands
them. The X32 points to another: each channel sends only one message, with
enough parameters for the whole channel... there goes self-documenting
messages. I am thinking of offering both as options in Ardour, BTW.
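To illustrate the overhead, here is a minimal stdlib-only Python sketch of
OSC 1.0 encoding, showing both the per-message padding cost and how a
bundle amortises the per-packet cost. The /strip/... addresses are just
examples, not any particular surface's map:

```python
import struct

def _pad(b: bytes) -> bytes:
    """OSC strings are NUL-terminated, then padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying only float32 arguments."""
    typetags = "," + "f" * len(floats)
    return (_pad(address.encode()) + _pad(typetags.encode())
            + b"".join(struct.pack(">f", f) for f in floats))

def osc_bundle(*messages: bytes) -> bytes:
    """Wrap messages in one '#bundle' with the 'immediately' timetag (1)."""
    body = b"".join(struct.pack(">i", len(m)) + m for m in messages)
    return _pad(b"#bundle") + struct.pack(">Q", 1) + body

one = osc_message("/strip/1/fader", 0.75)
many = osc_bundle(*(osc_message(f"/strip/{i}/fader", 0.5) for i in range(1, 9)))
print(len(one), len(many))  # 24 240
```

So eight fader updates cost eight UDP packets of 24 bytes each sent
one-by-one, but only a single 240-byte packet when bundled, which is why
surfaces that ignore bundles force the send-and-delay workaround.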
Then there is OCA https://www.ocaalliance.com/ (now AES70, I see), which
is much more tightly defined and also interactive like MIDI 2. It allows
the surface to find out what the device's controls are and create or
assign controllers to those functions. The setup expects to deal with more
than one device at a time. So for example an OCA surface might show an
Ardour strip with a preamp control section at the top that controls the
ALSA device directly, as if it was all one app being controlled. Yes, it
is possible to do this with OSC, but the setup is all manual and not all
OSC surfaces are willing to talk to more than one port. MIDI 1.0 already
supports more than one channel internally and MIDI 2 expands on that while
also offering control negotiation. There was work started on an OCA
library for Linux... but the dev seems to have gotten a life (a job,
maybe?).
I've now watched that vid. a second time (very much recommend looking at
it if you haven't already) and I'm even more impressed with the way
they've handled the new extensions. Also, to some degree they've split the
more technical aspects away from the more musician/performance-focused
ones.
My guess is we've got about 2 years to get up to speed before source instruments
become mainstream. Although I'd like to be involved myself, I really don't think
I've the skills to add anything useful :(
There are a number of "new" things that Linux is lacking, ALSA in
particular I suppose. I would list:
RTP-MIDI - old code exists, but I can't make it work
AES67 - I believe there is something started
AES70 - see above
AVB - there are some Linux bits of this (connect jack on two machines)
Bluetooth (in alsa) - there used to be BT in ALSA, now only in Pulse
and now MIDI 2 - spec not ready
a better way of dealing with HDMI audio in ALSA would be nice;
        HDMI audio for many people does not work with even
        medium-low latency as it needs a 4k buffer
a better way of dealing with HDA audio; Pulse does this well, but
        JACK and other audio applications that deal directly
        with ALSA do not. For example, try opening an HDA device
        in JACK or Ardour with more than two ports (most support
        6 or 8 output channels)
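The HDMI point is easy to quantify. A quick back-of-envelope sketch,
assuming 48 kHz and that the 4k buffer means 4096 frames:

```python
def buffer_latency_ms(frames: int, rate: int) -> float:
    """Latency contributed by one buffer of `frames` frames at `rate` Hz."""
    return frames / rate * 1000.0

# A 4096-frame HDMI buffer vs a typical 128-frame JACK period, both at 48 kHz:
print(round(buffer_latency_ms(4096, 48000), 1))  # 85.3
print(round(buffer_latency_ms(128, 48000), 1))   # 2.7
```

85 ms is fine for video playback but hopeless for live monitoring, which
is roughly the gap between "works in Pulse" and "works in JACK".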
Just as a short list. USB 2.0 audio has been a gift and a curse to Linux
audio. It has allowed almost all audio interfaces to work with Linux out
of the box (thank you, Apple!). It has also taken away from Linux the push
to deal with new audio setups... Linux audio is working, no need to mess
with it. Linux needs a new generation of developers to catch up with this,
either by adding these things to ALSA, or with something new. Basically,
if no one works on MIDI 2 infrastructure, MIDI 2 will only be used via
direct USB drivers to applications. Honestly, Linux MIDI handling is kind
of a mess right now anyway... maybe MIDI 2 is a blessing.
I sometimes wonder if all audio in Linux should be treated as if it was
network audio, following either AES67 or AVB as a standard even
internally, and provide the library functions to do so. Set the base
latency at 1 ms (works with both AES67 and AVB) and allow the library to
give the application whatever latency it wants. Because every application
would be an endpoint, jack-like routing is almost there by default (jackd
also allows mixing two streams into one port).
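As a sketch of why 1 ms works as a base unit: at 48 kHz a 1 ms
AES67-style packet carries 48 frames, and a library could hand an
application any multiple of that. Packet sizing under those assumptions
(linear PCM, payload only, no RTP header):

```python
def packet_payload_bytes(rate: int, channels: int, bits: int,
                         packet_ms: float = 1.0) -> int:
    """Payload of one linear-PCM packet, AES67-style 1 ms sizing."""
    frames = int(rate * packet_ms / 1000)
    return frames * channels * (bits // 8)

# 1 ms of 24-bit stereo at 48 kHz -- comfortably inside one UDP packet:
print(packet_payload_bytes(48000, 2, 24))  # 288
# Even 8 channels at 96 kHz stays under a typical 1500-byte MTU:
print(packet_payload_bytes(96000, 8, 24))  # 2304 -- oops, that one doesn't
```

An application wanting a larger period would simply buffer several 1 ms
packets, so the library can offer any latency that is a whole number of
milliseconds without renegotiating the stream.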
I also wish I was 30 years younger with todays knowledge...