[LAU] MIDI Input of LAU Applications
pshirkey at boosthardware.com
Mon May 18 05:58:43 EDT 2009
> On Wednesday, May 6th 2009 04:24:52 Patrick Shirkey wrote:
>> Crypto wrote:
>>> Hi @LAU people,
>>> I would like to open a discussion about the MIDI capabilities of LAU
>>> applications here.
>> I think everyone would agree that MIDI support for devices is not as
>> user friendly as it could be. It's something that has not had a lot of
>> attention because there are simply not enough people with devices who
>> are prepared to spend time on it at the moment.
>> Have you tried working with OSC to connect your device?
>> Which specific apps are you using to connect the device with and what
>> are the specific issues that you see? If we strip it back to specific
>> issues then we should be able to use your setup as a test bed of sorts
>> to get round them or give the original devs a heads up.
>> Patrick Shirkey
>> Boost Hardware Ltd
> thx for your reply here.
> I have had a look at the OSC webpages and it seems to me that OSC is not what
> I need. It appears to exceed the abilities / overcome the limitations of MIDI
> to some degree, but if an application does not implement OSC features I am
> stuck again (or, if that is not true, then I have not yet found out how to
> control any application using OSC, in a setup in which OSC acts as a
> MIDI-to-PC-command translator).
> This is what I have and what I would like to do with it:
Very cool idea.
So from your description below, you have some trouble getting "some"
MIDI messages to be received by various applications?
Hydrogen is being actively worked on, and I'm sure you would get a quick
response from the team if you posted this specific issue: the message for
the fill-in event not being understood/received.
It's possible that no one has thought to implement this, but that it is
actually a small step: a couple of lines of specific code for that
feature connected to a specific function. I assume that Hydrogen/H2
already supports fill-ins via a GUI control, so attaching a specific
MIDI message to that control should be a straightforward step if you can
provide the sysex message and/or assist with testing.
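To make that concrete, here is a minimal, purely hypothetical sketch of wiring one raw MIDI message to an app-side action. The controller number (20) and the fill-in callback are invented for illustration; this is not Hydrogen's actual code, just the shape of the "couple of lines" involved:

```python
# Hypothetical sketch: dispatch a raw MIDI message to an app action.
# 0xB0 = Control Change on channel 1; controller 20 stands in for
# whatever message the organ actually sends for its fill-in button.

def make_dispatcher(bindings):
    """bindings: {(status, data1): callback} -> handler for raw messages."""
    def handle(message):
        key = (message[0], message[1])
        action = bindings.get(key)
        if action:
            action(message[2])       # pass the value byte along
            return True
        return False
    return handle

fired = []
dispatch = make_dispatcher({(0xB0, 20): lambda v: fired.append(("fill_in", v))})
dispatch([0xB0, 20, 127])            # CC#20 on channel 1 -> trigger fill-in
```

The real work in an app is finding the function behind the GUI control; the dispatch itself is this small.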
I agree with you that we are sorely missing a universal method for
learning the sysex of the available controller. A daemon app that looked
after this and provided a standard translation for every app to access
would be ideal. So far it has been left to the devs to implement the
MIDI protocol directly via ALSA/OSS in each app. Recently things have
improved with JACK MIDI, but it is still being road tested as it is just
a couple of years old.
I'm not sure if anyone is working on a daemon for learning and
translating MIDI control messages. Perhaps someone could start by taking
the specific code out of Ardour, or maybe, if enough people put a bounty
on it, Paul would consider splitting it out himself for everyone to use...
It's a pretty big project and would need a commitment of several months
to make it truly accessible. But we do have the building blocks in place
if someone wanted to make some headway.
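As a rough illustration of the learn-and-translate idea: in learn mode the daemon binds the next message it sees to a symbolic action name, and afterwards translates incoming messages into those names for any app to consume. All names here are made up, and a real daemon would sit on ALSA or JACK MIDI ports rather than plain byte lists:

```python
# Hypothetical sketch of a "midi-learn" daemon core. No real MIDI I/O;
# messages are plain [status, data1, data2] byte lists for illustration.

class MidiLearner:
    def __init__(self):
        self.mappings = {}        # (status, data1) -> symbolic action name
        self.pending = None       # action name waiting to be learned

    def learn(self, action_name):
        """Arm learn mode: the next message seen is bound to action_name."""
        self.pending = action_name

    def feed(self, message):
        """Feed one raw MIDI message; return the mapped action or None."""
        key = (message[0], message[1])
        if self.pending is not None:
            self.mappings[key] = self.pending
            self.pending = None
            return None
        return self.mappings.get(key)

learner = MidiLearner()
learner.learn("hydrogen.fill_in")
learner.feed([0xB0, 20, 127])     # first message is captured, not dispatched
```

The hard parts a real daemon would add on top are the transport (ALSA/JACK ports), persistence of the mappings, and a protocol for apps to subscribe to the translated names.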
Patrick Shirkey
Boost Hardware Ltd
> Following this lengthy link you can find a short description and a photo taken
> of my MIDIfied, formerly purely analogue, electric organ. I have removed all
> the analogue circuitry that was inside it previously and replaced it with a
> µcontroller-operated MIDI board. This board can evaluate incoming events like
> pressing a button, turning a knob and moving a drawbar and convert them into a
> MIDI message.
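As an aside: a drawbar or knob move like this usually ends up on the wire as a three-byte Control Change message. A minimal sketch of what the board builds (channel 0 and controller 20 are arbitrary illustrative choices, not your board's actual assignment):

```python
# A drawbar move typically becomes a 3-byte Control Change message:
# status byte 0xB0 | channel (0-15), controller number (0-127),
# value (0-127). Channel/controller here are arbitrary examples.

def control_change(channel, controller, value):
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("out of MIDI range")
    return bytes([0xB0 | channel, controller, value])

msg = control_change(0, 20, 64)   # drawbar at mid position
```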
> What I would like to do in the first place is "simply" control MIDI applications
> that run on my notebook via these MIDI messages. Or, to put it another way:
> I need a MIDI link from my MIDI organ to the applications running on my
> notebook. The commands are sent after triggering/switching/moving the
> corresponding hardware controls.
> I have already prepared a testing setup for controlling the B4II virtual organ
> and in this case I can see drawbars moving on the notebook screen after moving
> the hardware drawbars of my MIDI organ. I need similar behaviour/environment
> with all other LAU applications. I need them to be controllable via MIDI in
> the same way as they already are controllable via computer keyboard and mouse
> operation on the GUI.
> I would like to do similar to the B4II stuff with a drum application, such as
> starting a rhythm and stopping a rhythm playing, and also play a fill in when
> pressing a button, play an ending, have single finger chord recognition,
> trigger single drum sounds. Similar to the moving drawbars of the B4II GUI it
> means in this case that after pressing the hardware button on my MIDI organ a
> MIDI message (typically "start") would make a drum application start a
> particular rhythm to play (or a pattern, whatever). In that case it could well
> mean that one sees a virtual button being pressed on the GUI of the drum
> application.
> So far I have not found any application that would be capable of doing all of
> that, unless I have the opportunity of doing the programming myself. For that,
> the only MIDI language environment I have found is keykit.
> Keykit has other things that still need resolving (I have found it cannot yet
> handle more than one MIDI input on linux machines), but it can do e.g.
> realtime operation, deal with MIDI data in a programming like environment and
> it can even run system commands in a shell. Unless other applications take up
> the MIDI track again, I plan to use keykit as a filter for incoming
> MIDI events and even to program my own drum computer with it.
> I was struck by the fact that an otherwise great application like hydrogen is
> unable to perform a simple thing such as playing a fill-in via MIDI, although
> it offers plenty of ways to play all sorts of MIDI sequences (I compare a fill-
> in to a sequence here). I know that hydrogen offers some commands such as
> "start" and "stop", but that is not enough for what I would like to do.
> hydrogen is like Edward Scissorhands without the human hands he should have
> received. hydrogen could do it if it only had a more advanced MIDI handling.
> Or to put it short:
> Play a sequence as a fill-in: use the computer keyboard or mouse to start playing.
> How can I do the same thing in a live play environment from my MIDI organ?
> Fiddling with mouse and notebook keyboard is no useful option.
> I am not asking for more features some applications already have, because I am
> sure that LAU applications really have matured and become feature rich - I am
> asking for a way to use these features by triggering them via a MIDI command.
> That would enable live musicians to use their tools live, while playing,
> without any staring at a computer screen.
> I understand that some things are not defined by MIDI specification and that
> could be the reason why there is no "fill in" option in hydrogen, to stay with
> this example. But then: the B4II software which I mentioned above has all
> kinds of controller messages overriding the standard MIDI definition. So I
> guess it could well be possible to implement one's own MIDI controller
> messages in an application (if that really were necessary and there were no
> standard MIDI message for doing that), or otherwise SysEx messages, NRPN...
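For what it's worth, an NRPN "message" is really just a run of Control Changes on one channel: CC#99/98 select a 14-bit parameter number, and CC#6/38 carry a 14-bit value (MSB then LSB in each pair). A rough sketch of building one (the channel and parameter numbers below are arbitrary examples):

```python
# Build an NRPN parameter-set as a run of Control Changes:
# CC#99 = NRPN MSB, CC#98 = NRPN LSB, CC#6 = Data Entry MSB,
# CC#38 = Data Entry LSB. Parameter and value are 14-bit (0-16383).

def nrpn(channel, parameter, value):
    status = 0xB0 | channel
    return bytes([
        status, 99, (parameter >> 7) & 0x7F,   # NRPN MSB
        status, 98, parameter & 0x7F,          # NRPN LSB
        status, 6,  (value >> 7) & 0x7F,       # Data Entry MSB
        status, 38, value & 0x7F,              # Data Entry LSB
    ])

msg = nrpn(0, 300, 1000)   # parameter 300 set to 1000 on channel 1
```

This is how apps like the B4II expose parameters beyond the 128 standard controllers without resorting to SysEx.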
> Kind regards,
> Linux-audio-user mailing list
> Linux-audio-user at lists.linuxaudio.org