I think the way to make an inter-app modular really useful and new would be
to use JACK for *ALL* signal passing, including control signals as CV
emulations, by passing a DC signal through JACK (as, say, the equivalent of
a Csound k-rate signal or a hardware modular control voltage). If the
modular was some sort of framework for interconnecting apps and plugins
controlled by MIDI and JACK DC signals, one could actually accomplish the
kind of flexibility possible with hardware modulars and still be able to
incorporate other work.
Iain
----- Original Message -----
From: "Mike Rawes" <mike_rawes(a)yahoo.co.uk>
To: "The Linux Audio Developers' Mailing List"
<linux-audio-dev(a)music.columbia.edu>
Sent: Thursday, January 15, 2004 11:46 AM
Subject: Re: [linux-audio-dev] Project: modular synth editor
On Wed, 14 Jan 2004 13:04:09 -0500
Dave Robillard <drobilla(a)connect.carleton.ca> wrote:

> > I think the main reason for this (for SSM at least) is firstly that
> > LADSPA support came after the app was developed, and secondly LADSPA
> > plugins don't have GUIs - these have to be generated, with limited
> > success. SSM is great in this regard - the native plugins all have
> > really cool GUIs.
> Rather than having GUIs to control plugins, I was thinking ports would
> be used for the most part.. real analog modulars don't have nifty GUIs
> :).  The GUI problem can be solved pretty well with a slightly more
> advanced version of what SSM is doing (in CVS) right now... I haven't
> really seen any LADSPA plugins that need more than a bunch of
> sliders/knobs.
 
It's when you have several connected plugins forming a single module
(I'll roll out my old example of an ADSR, a few DCOs, DCA etc) that GUI
generation really falls down. An alternative is that used in gAlan
(http://galan.sourceforge.net), where you build up your UI and hook it
up to whatever ports you like. Or PD - which lets you do... just about
anything.
> The ideas ams/ssm have used are pretty good, but I would add the
> capability of choosing on an individual control basis whether the
> control was in the GUI, or exported as a port.  This is definitely
> something severely limiting about ams for me right now... (sure, you
> can use MIDI out to get around it, but ew).
> > Another issue is that LADSPA plugins cannot be used for sample
> > playback, as loading samples is impossible unless you stream the
> > data in a la SooperLooper (http://essej.net/sooperlooper).
> This is where the simplicity comes in.. we're not making a sampler :).
> Use an external sampler; if you want the audio in the synth, bring it
> in as audio.
 I agree - for most cases a separate sampler is fine. Composite
 samples-plus-filters-and-stuff instruments might be trickier, but doable
 (just send the Note On or whatever to both the sampler and the synth).
 [...]
> > > Anybody with more ideas/criteria for the perfect modular synth?
> >
> > OK here I go: From my notes (and fuzzy memory :), what I'd like is a
> > simple cli program that communicates via some form of IPC.
> > OpenSoundControl (OSC) would seem to be a good mechanism, as it's
> > quite widely supported and should be sufficient for all that would
> > be required. [chop]
 
> Making an "interface" or communication protocol, IPC, whatever you
> wanna say, I think is way way way overshooting what needs to be done.
> What's the point really?  The overhead is definitely not worth it
> (realtime audio..).  The engine should be just code, straight up.  We
> need to design the architecture of the _program_, not an interface.
 
 The engine would be straight up code. However, the interface would still
 need to be defined as a set of functions for manipulating the engine.
All the OSC/IPC/whatever bit is for is to allow this manipulation to be
 done by a program that is entirely separate from the engine. It is
 likely that each OSC message (or whatever) would map one-for-one to each
 function.
 The overhead is minimal - it's not being used to stream audio, or
 anything even close to that bandwidth.
> If you really wanted a UI like what you described, it could be
> implemented as, well, a UI plugin.  This is my idea anyway, if you
> have some good rationale for a system like that please elaborate
What I was suggesting was not a *user* interface: I wouldn't expect the
user to control it directly by sending and receiving OSC messages!
For an excellent example of what I'm getting at, have a look at ecasound
(http://eca.cx) - a multitrack audio engine. This is controlled by the
Ecasound Control Interface (ECI), which can be used by any number of UIs,
both graphical and text-based, to control a common engine (ecasound
itself) for a variety of purposes.
A full modular with a similar architecture to ecasound is what I'd like
to see.
> , but I think you're overthinking.
 Guilty as charged. I do too much thinking, and not enough doing :)
> I know, I'm really clinging to simplicity here, but it's for good
> reasons: simplicity breeds usefulness (UNIX), but more importantly,
> simple means the damn thing will actually get _written_ :).
 That's the idea - simplicity. I wouldn't want any (G)UI code anywhere
 near the engine. The engine should know nothing of knobs, sliders or
 whatever, just what it receives via its inputs.
 I'm now motivated to get coding :)
--
Myk