Hi Paul,
Paul Davis wrote:
> +----------------+                  +--------------+       +--------------+
> | input & output | <----- X ------> | GUI Toolkit  |       | Application  |
> |    devices     | <---- Midi ----> | (gtk, etc..) | <---> | (MVC or not) |
> |                | <-- Whatever --> |              |       |              |
> +----------------+                  +--------------+       +--------------+
> the model you've given above is just unwieldy for this. the update
> frequency required to get smooth fader motion (part of the "GUI" of the
> MIDI h/w) is very different than is required for the screen GUI.
My diagram was not clear enough. I did not mean that gtk should talk to the MIDI
h/w in the same way it talks to the screen. The refresh rate you're talking
about is just one of many things in which a MIDI h/w device differs from a
screen/keyboard set.
The point is simply that by coupling the MIDI input and the UI toolkit layer,
one can take an interesting shortcut. More below.
> it should look like this:
>
> +------------+             +----------+
> | Screen GUI |<----------->| "bridge" |<--------+
> +------------+             +----------+         |
>                                                 +<-----> MODEL
> +------------+             +----------+         |
> | MIDI (G)UI |<----------->| "bridge" |<--------+
> +------------+             +----------+
> the "bridge" component is just some s/w that converts between whatever
> the communication mechanisms are for the model/software and software/UI
> links. for the screen GUI, its something written using some kind of GUI
> toolkit; for the MIDI system its something that understands how to talk
> MIDI as well as the specific control surface protocol.
> this keeps the code nice and clean, allows you to add new
> controllers/views without modifying the existing controllers/views, and
> seems to me to be a no-brainer.
I agree, it's clean. If I understand the "bridge" right, each one boils down to
a small adapter interface; here is a rough C sketch of what I mean (the names
are mine, purely for illustration):
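  /* a minimal sketch of one "bridge", assuming the model exposes its
   * parameters as floats; "Bridge" is an invented name, not any real API */
  typedef struct {
      /* model -> UI: the model pushes a new value to this surface */
      void (*show)(int param_id, float value);
      /* UI -> model: the surface reports a change made by the user */
      void (*changed)(int param_id, float value);
  } Bridge;

  /* the screen bridge would implement show() with, say,
   * gtk_range_set_value(); the MIDI bridge would implement it by sending
   * whatever CC/sysex message the control surface expects.  the model
   * only ever talks to the Bridge interface. */

Now, let me describe what I saw over the last couple of days: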
1 - a friend of mine showed me his brand new Evolution UC-33 MIDI controller.
He's using it with a Windows-based program called "Live". I was really
surprised to see that this software uses exactly what I proposed in this thread.
First you enter a "capture mode", you click on some GUI widget, then you turn
some knob, and that's it: it's assigned. I promise I had never seen such a thing
when I had exactly the same idea.
2 - Tonight, I went to a concert by a jazz drummer, a friend of mine. He was
using a Macintosh coupled with a MIDI h/w controller. He spent more time playing
with the computer than with his drum set, but never mind... My point is: he was
using both the MIDI controller and the screen/keyboard/mouse set, just like my
other friend above.
Now, here's what I consider a very practical point: there are many GUI-enabled
apps which can't run headless. The usual way of using a MIDI h/w controller is
as an add-on, not as a replacement for the screen. I'd personally like to use
such MIDI h/w as a standalone device on stage, but nothing would stop me from
unplugging the screen in that case...
By "MIDI-enabling" some toolkit widgets, _many_ existing apps would suddenly
become compatible with these dedicated hardware controllers. I do agree with you
that the best approach would be for these apps to add MIDI input support as a
separate "View".
But on the other hand, what about the "capture-style" way of assigning knobs to
widgets? Don't you see how efficient this is? In one second, what you touch on
the controller is connected to what you see on your screen (WYTIWYS ;-).
Still, of course, you may say that in this case these are two separate layers
that artificially appear to be one... But what about a shortcut to couple these
two layers, if they are to be bound so tightly?
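The capture handshake itself is tiny. Roughly, in C with GTK and the ALSA
sequencer (capture_mode_active and bind_cc_to_widget() are helpers I just made
up for the example):

  #include <gtk/gtk.h>
  #include <alsa/asoundlib.h>

  static void bind_cc_to_widget(int channel, int cc, GtkWidget *w); /* invented */

  static gboolean  capture_mode_active = FALSE;
  static GtkWidget *armed_widget = NULL;  /* set when the user clicks a widget */

  /* connected to "button-press-event" on every attached widget */
  static gboolean on_widget_clicked(GtkWidget *w, GdkEventButton *ev,
                                    gpointer data)
  {
      if (capture_mode_active) {
          armed_widget = w;   /* now wait for the next CC event */
          return TRUE;        /* swallow the click */
      }
      return FALSE;
  }

  /* called for each incoming ALSA sequencer event */
  static void on_midi_event(snd_seq_event_t *ev)
  {
      if (ev->type == SND_SEQ_EVENT_CONTROLLER && armed_widget != NULL) {
          /* bind (channel, CC number) -> widget, then disarm */
          bind_cc_to_widget(ev->data.control.channel,
                            ev->data.control.param, armed_widget);
          armed_widget = NULL;
      }
  }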
I believe there could exist a library with which (see the sketch after this
list):
1 - you instantiate a core object (providing the ALSA MIDI port as an arg)
2 - you "attach" it to some widgets: sliders, spin buttons, etc. (note that this
is different from extending (bloating) the widgets)
3 - you may call a function to enter the capture mode
4 - 100% of this capture mode is encapsulated by the library: knob-to-widget
assignments are handled transparently
5 - there is some way to retrieve these assignments so they can be recalled
later
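In C, the whole public surface could be as small as this (every name below is
hypothetical, just to fix ideas):

  #include <gtk/gtk.h>

  typedef struct MidiUi MidiUi;

  /* 1 - instantiate a core object on an existing ALSA sequencer port */
  MidiUi *midi_ui_new(const char *alsa_seq_port);      /* e.g. "128:0" */

  /* 2 - attach to widgets without subclassing them */
  void midi_ui_attach(MidiUi *mui, GtkWidget *widget, const char *id);

  /* 3/4 - enter capture mode: the next (widget click, CC event) pair
   *       is bound by the library, transparently to the app */
  void midi_ui_capture(MidiUi *mui);

  /* 5 - save/restore the knob-to-widget assignments */
  char *midi_ui_save_assignments(MidiUi *mui);         /* caller frees */
  void  midi_ui_load_assignments(MidiUi *mui, const char *data);

An app would call midi_ui_attach() once per slider at startup, and nothing else
in its code would have to know MIDI exists.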
--
og