Paul Davis wrote:
>> Actually, I just started from the following assumption: a midi
>> hardware controller is an input device, among keyboard, mouse and
>> others.
>
> the problem is that it's also an output device, for some midi h/w. add
> this into the situation, and you start to realize that the midi h/w is
> actually a "GUI" just like the one on the screen. forcing one GUI to
> be mediated by another seems odd.
Okay, let me try to improve this...
Assumption v0.2: a MIDI hardware controller is a set of input and
output devices, identical in nature to the more conventional monitor,
keyboard and mouse.
And because I like ASCII diagrams, here is where this assumption leads me:
+----------------+                  +--------------+       +--------------+
| input & output | <----- X ------> | GUI Toolkit  |       | Application  |
|    devices     | <---- Midi ----> | (gtk, etc..) | <---> | (MVC or not) |
|                | <-- Whatever --> |              |       |              |
+----------------+                  +--------------+       +--------------+
About configuration: there are tools to map keys to letters on a
keyboard, so there could be tools to map knobs to controller numbers
in the case of a MIDI box.
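
To make the keymap analogy concrete, here is a rough sketch in C of
what such a knob map could look like (the controller numbers and the
parameter names are invented for illustration; a real tool would load
them from a configuration file):

/* Minimal sketch of a knob-to-parameter map, assuming the
 * application exposes named parameters ("gain", "pan", ...).
 * It plays the same role for a MIDI box as a keymap does for
 * a keyboard. */
#include <stddef.h>
#include <stdio.h>

struct cc_binding {
    int         cc_number;  /* controller number the knob sends */
    const char *parameter;  /* application parameter it drives  */
};

/* Hypothetical bindings; in practice these come from the user. */
static const struct cc_binding bindings[] = {
    { 7,  "gain" },
    { 10, "pan"  },
};

/* Translate an incoming control change into a parameter update. */
static void dispatch_cc(int cc, int value)
{
    for (size_t i = 0; i < sizeof bindings / sizeof bindings[0]; i++) {
        if (bindings[i].cc_number == cc) {
            /* A real application would call into the model here. */
            printf("set %s = %d\n", bindings[i].parameter, value);
            return;
        }
    }
}

int main(void)
{
    dispatch_cc(7, 100);  /* knob sending CC 7 moves the gain */
    dispatch_cc(10, 64);  /* knob sending CC 10 moves the pan */
    return 0;
}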
This is all theoretical, but can it be considered false?
Treating it as false would at least hurt users like me: if I already
have an alternative input/output device like a MIDI controller, I do
not want to have to run a GUI as well, since I wouldn't even see it.
In my mind, the application should be the engine/model, and all
views/controllers should be independent of each other. Having to go
through the bloat of running GTK or something similar is just silly if
what someone wants is to just send/receive MIDI data.
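
To show what I mean in code, here is a rough sketch of that shape (the
callback interface and all the names are invented, not taken from any
existing API). A GTK window and a MIDI surface would both attach to
the model the same way, and neither would know about the other:

/* Minimal sketch of an engine/model with independent views,
 * assuming a plain callback-based observer interface. */
#include <stdio.h>

#define MAX_VIEWS 8

typedef void (*view_cb)(const char *parameter, int value);

struct model {
    int     gain;              /* the engine's actual state  */
    view_cb views[MAX_VIEWS];  /* attached views, GUI or not */
    int     n_views;
};

static void model_attach(struct model *m, view_cb cb)
{
    if (m->n_views < MAX_VIEWS)
        m->views[m->n_views++] = cb;
}

/* Any controller (GUI widget, MIDI knob, ...) calls this; every
 * attached view gets notified, and none depends on the others. */
static void model_set_gain(struct model *m, int value)
{
    m->gain = value;
    for (int i = 0; i < m->n_views; i++)
        m->views[i]("gain", value);
}

/* Stand-ins for the real views. */
static void gui_view(const char *p, int v)  { printf("GUI:  %s = %d\n", p, v); }
static void midi_view(const char *p, int v) { printf("MIDI: %s = %d\n", p, v); }

int main(void)
{
    struct model m = { 0 };
    model_attach(&m, gui_view);
    model_attach(&m, midi_view);
    model_set_gain(&m, 100);
    return 0;
}

Drop the model_attach(&m, gui_view) line and nothing else changes;
that is exactly the independence I am after.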
--
CYa,
Mario