[LAD] handling midi input in a jack app?

Iain Duncan iainduncanlists at gmail.com
Wed Dec 28 19:54:39 UTC 2011


On Wed, Dec 28, 2011 at 11:35 AM, Paul Davis <paul at linuxaudiosystems.com> wrote:

> On Wed, Dec 28, 2011 at 1:46 PM, Iain Duncan <iainduncanlists at gmail.com>
> wrote:
> > Hey folks, what is the easiest way to deal with midi input in a jack
> > app? I'm confused by the difference between jack midi and alsa midi,
> > because I have two midi inputs: one is a usb input, so it appears at a
> > low level as an alsa device, but the other is the midi input on a
> > firewire unit, and it appears as a jack midi device. I'd like to make
> > sure that whatever I do is easy to port to other systems. Does it make
> > sense to use portmidi or rtmidi to get input, or should I stick to the
> > jack api entirely?
>
> JACK's MIDI API is substantially different from all others in that you
> receive MIDI data in the same thread (and same callback) in which you
> receive audio data. In this sense it's much more like the mechanisms
> plugin APIs use to deliver MIDI during their equivalent of JACK's
> process() call.
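
If I understand correctly, a minimal sketch of what that looks like with
the JACK MIDI API (port registration and client setup omitted; names are
illustrative):

    /* sketch only: reading JACK MIDI events inside the audio callback */
    #include <jack/jack.h>
    #include <jack/midiport.h>

    jack_port_t *midi_in;  /* registered elsewhere with jack_port_register() */

    static int process(jack_nframes_t nframes, void *arg)
    {
        void *buf = jack_port_get_buffer(midi_in, nframes);
        jack_nframes_t i, n = jack_midi_get_event_count(buf);

        for (i = 0; i < n; ++i) {
            jack_midi_event_t ev;
            if (jack_midi_event_get(&ev, buf, i) == 0) {
                /* ev.time is the frame offset within this cycle, so the
                   event can drive synthesis sample-accurately right here */
            }
        }
        return 0;
    }
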
>
> None of the other APIs that you've mentioned have this property, nor do
> any of the Windows MIDI APIs or CoreMIDI.
>
> This means that you face opposing issues depending on which API you
> choose to use:
>
>  * if you use JACK:
>        - MIDI data is trivially available to alter synthesis done
>          during process()
>        - MIDI data needs to be moved across thread boundaries to be
>          useful outside of process()
>
>  * if you use ALSA, portmidi, rtmidi, CoreMIDI or anything else:
>        - MIDI data to be used for synthesis has to be moved across
>          thread boundaries
>        - MIDI data used for other purposes can often be used in the
>          same thread it was received in, though not always
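
And for the "moved across thread boundaries" cases, I gather the usual
approach is a lock-free jack_ringbuffer with one writer in process() and
one reader in a GUI/control thread; a rough sketch, with illustrative
sizes and names:

    /* sketch: shipping MIDI events from process() to a non-RT thread */
    #include <jack/midiport.h>
    #include <jack/ringbuffer.h>

    #define MAX_EV 256
    jack_ringbuffer_t *rb;  /* e.g. jack_ringbuffer_create(8192) at startup */

    /* RT side: called from process() for each event wanted elsewhere */
    static void queue_event(const jack_midi_event_t *ev)
    {
        if (ev->size > MAX_EV)
            return;  /* skip oversized sysex in this sketch */
        if (jack_ringbuffer_write_space(rb) < sizeof(ev->size) + ev->size)
            return;  /* drop rather than block the audio thread */
        jack_ringbuffer_write(rb, (const char *)&ev->size, sizeof(ev->size));
        jack_ringbuffer_write(rb, (const char *)ev->buffer, ev->size);
    }

    /* non-RT side: drained periodically from a GUI/control thread */
    static void drain_events(void)
    {
        size_t size;
        unsigned char bytes[MAX_EV];
        while (jack_ringbuffer_read_space(rb) >= sizeof(size)) {
            jack_ringbuffer_read(rb, (char *)&size, sizeof(size));
            jack_ringbuffer_read(rb, (char *)bytes, size);
            /* act on the event here, outside the audio thread */
        }
    }
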
>
> This issue is far more substantive than the question of whether you
> can access a firewire-based MIDI port or a USB-based MIDI port.
>
>
Thanks for that explanation. In my case, I believe I will have two kinds
of midi input: one best served by the first set of tradeoffs and the
other by the second. The user may be playing a synth, or the midi input
may be used to control the app. Is it reasonable to use both jack midi
and a non-jack midi api in the same app, with different midi input
devices on each?
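
Something like this is what I have in mind for the control side, reading
the second device through the ALSA sequencer API in its own thread (rough
sketch, names are illustrative):

    /* sketch: blocking ALSA sequencer input loop in a separate thread */
    #include <alsa/asoundlib.h>

    static void *alsa_control_thread(void *arg)
    {
        snd_seq_t *seq;
        if (snd_seq_open(&seq, "default", SND_SEQ_OPEN_INPUT, 0) < 0)
            return NULL;
        snd_seq_set_client_name(seq, "myapp-control");
        snd_seq_create_simple_port(seq, "control in",
                SND_SEQ_PORT_CAP_WRITE | SND_SEQ_PORT_CAP_SUBS_WRITE,
                SND_SEQ_PORT_TYPE_APPLICATION);

        for (;;) {
            snd_seq_event_t *ev;
            if (snd_seq_event_input(seq, &ev) < 0)  /* blocks for an event */
                continue;
            if (ev->type == SND_SEQ_EVENT_CONTROLLER) {
                /* ev->data.control.param / .value drive the app's
                   controls, already outside the audio thread */
            }
        }
        return NULL;
    }
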

thanks
Iain

> --p
>