On 6/28/05, Olivier Guilyardi <ml(a)xung.org> wrote:
1 - A friend of mine showed me his brand new Evolution UC33 MIDI controller. He's using it with a Windows-based piece of software called "Live". I was really surprised to see that this software uses exactly what I proposed in this thread: first you enter a "capture-mode", you click on some GUI widget, then you rotate some knob, and that's it: it's assigned. I promise I had never seen such a thing when I had exactly the same idea.
I think everyone would agree this is a Good Thing.
2 - Tonight I went to a concert by a jazz drummer, a friend of mine. He was using a Macintosh coupled with a MIDI h/w controller. He spent more time playing with this computer than with his drumset, but never mind... My point is: he was using both the MIDI controller and the screen/keyboard/mouse set, just like my other friend above.
I don't see what this has to do with anything.
Now, here's what I consider to be a very practical consideration: there are many GUI-enabled apps which can't run headless. A usual way of using a MIDI h/w controller is as an add-on, not as a replacement for the screen. I'd personally like to use such a MIDI h/w controller as a standalone device, on stage, but nothing forbids me from unplugging the screen in that case...
Never mind headless; it might be useful to run without X (hence no
GUI) on older PCs.
By "midi-enabling" some toolkit widgets,
_many_existing_ apps would suddenly
become compatible with these dedicated controller hardware devices. I do agree
you that the best would be for these apps to add some midi input support, as a
separated "View".
But on the other hand, what about the "capture-style" way of assigning knobs to widgets? Don't you see how efficient this is? In one second, what you touch on the controller is connected with what you see on your screen (WYTIWYS ;-). Still, of course, in this case you may say that these are two separate layers that artificially appear to be one... But what about a shortcut to couple these two layers, if they are to get so tight?
I see no reason why this couldn't be done with an MVC-like architecture:
1. Your (MVC) Controller enters "capture mode"
2. you move a slider in your GUI, which sends a message to the MVC Controller
3. you turn a knob on your MIDI controller, which sends a message to the MVC Controller
4. the MVC Controller binds the two
But I should shut up because I've never done the above.
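Just to make steps 1-4 concrete, here is a minimal, self-contained sketch of how they could fit together. Everything in it is invented for illustration; there is no real GUI or MIDI I/O, the "events" are simply fed in by hand from main(), so treat it as pseudo-code that happens to compile:

/* Minimal sketch of "capture mode" binding in an MVC-like setup.
 * GuiWidget and MidiKnob-style events are stand-ins (hypothetical
 * names); main() simulates the GUI click and the incoming MIDI CC. */
#include <stdio.h>

#define MAX_BINDINGS 16

typedef struct {                 /* stand-in for a GUI slider/knob   */
    const char *name;
    double value;
} GuiWidget;

typedef struct {                 /* one MIDI CC -> widget assignment */
    int cc_number;
    GuiWidget *widget;
} Binding;

typedef struct {                 /* the MVC "Controller"             */
    int capturing;               /* are we in capture mode?          */
    GuiWidget *pending_widget;   /* last widget touched while capturing */
    Binding bindings[MAX_BINDINGS];
    int n_bindings;
} Controller;

/* step 1: enter capture mode */
static void controller_enter_capture(Controller *c)
{
    c->capturing = 1;
    c->pending_widget = NULL;
}

/* step 2: the GUI view reports that a widget was touched */
static void controller_gui_touched(Controller *c, GuiWidget *w)
{
    if (c->capturing)
        c->pending_widget = w;
}

/* steps 3+4: a MIDI CC arrives; if we are capturing and a widget is
 * pending, bind them, otherwise route the value to the bound widget */
static void controller_midi_cc(Controller *c, int cc, int value)
{
    if (c->capturing && c->pending_widget && c->n_bindings < MAX_BINDINGS) {
        c->bindings[c->n_bindings].cc_number = cc;
        c->bindings[c->n_bindings].widget = c->pending_widget;
        c->n_bindings++;
        c->capturing = 0;
        printf("bound CC %d to widget '%s'\n", cc, c->pending_widget->name);
        return;
    }
    for (int i = 0; i < c->n_bindings; i++)
        if (c->bindings[i].cc_number == cc) {
            c->bindings[i].widget->value = value / 127.0;
            printf("CC %d -> '%s' = %.2f\n", cc,
                   c->bindings[i].widget->name,
                   c->bindings[i].widget->value);
        }
}

int main(void)
{
    Controller ctl = {0};
    GuiWidget cutoff = { "cutoff", 0.0 };

    controller_enter_capture(&ctl);        /* 1. capture mode           */
    controller_gui_touched(&ctl, &cutoff); /* 2. click the GUI slider   */
    controller_midi_cc(&ctl, 74, 0);       /* 3+4. turn a knob: bound   */
    controller_midi_cc(&ctl, 74, 100);     /* later CCs drive the widget */
    return 0;
}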
I believe there could exist a library with which:
1 - you instantiate a core object (providing the ALSA MIDI port as an arg)
2 - you "attach" to some widgets: sliders, spin buttons, etc. (note that this is different from extending (bloating) widgets)
3 - you may call a function to enter the capture-mode
4 - 100% of this capture-mode is encapsulated by the library: knob-to-widget assignments are handled transparently
5 - there is some way to retrieve these assignments to recall them later
You seem to really like this idea. Why don't you just do it and see
if it works well? I have an unfounded hunch that it won't, since you
usually want your GUI running in a lower priority thread.
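For what it's worth, the application-side API you describe might boil down to something like the stub below. All the names (mw_new(), mw_attach_slider(), the "128:0" port) are invented for the example, and the function bodies only print what a real implementation talking to the ALSA sequencer and the toolkit would actually do:

/* Hypothetical application-side view of the proposed library.
 * The stubs stand in for the real ALSA and widget plumbing. */
#include <stdio.h>

typedef struct {
    const char *alsa_port;       /* e.g. an ALSA client:port pair like "128:0" */
    int capture;
} MidiWidgets;

typedef struct { const char *id; } Slider;   /* stand-in for a GUI slider */

/* 1. instantiate the core object, giving it the ALSA MIDI port */
static MidiWidgets *mw_new(const char *alsa_port)
{
    static MidiWidgets mw;
    mw.alsa_port = alsa_port;
    mw.capture = 0;
    printf("would open ALSA sequencer port %s here\n", alsa_port);
    return &mw;
}

/* 2. attach to existing widgets (no subclassing/bloating of the widget) */
static void mw_attach_slider(MidiWidgets *mw, Slider *s)
{
    (void)mw;
    printf("now listening for changes on slider '%s'\n", s->id);
}

/* 3. enter capture-mode; 4. the knob-to-widget assignment then happens
 * entirely inside the library, invisibly to the application */
static void mw_enter_capture(MidiWidgets *mw)
{
    mw->capture = 1;
    printf("capture-mode: touch a widget, then turn a knob\n");
}

/* 5. retrieve the assignments so they can be saved and recalled later */
static void mw_save_assignments(MidiWidgets *mw, const char *path)
{
    (void)mw;
    printf("would write the CC-to-widget table to %s here\n", path);
}

int main(void)
{
    Slider cutoff = { "cutoff" }, volume = { "volume" };
    MidiWidgets *mw = mw_new("128:0");

    mw_attach_slider(mw, &cutoff);
    mw_attach_slider(mw, &volume);
    mw_enter_capture(mw);
    mw_save_assignments(mw, "assignments.conf");
    return 0;
}

The point of the attach-style calls is that the widgets themselves stay untouched; the library only observes them from the outside, which is what keeps it from "bloating" the toolkit.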