On Monday 12 February 2007 21:34:08 Stefano D'Angelo wrote:
Well, that's our intent in CLAM[1]. The goal is that CLAM should be able
to run a given processing algorithm transparently under several backends.
Currently we support, to some extent, PortAudio, Jack, Alsa, and VST. The
first three backends can be used with a Qt Designer interface. We still
have to face several fronts: unifying the interface to fit all the
backends, incorporating more backends (some work on Ladspa has been
done), and enabling Qt GUIs for more backends (notably VST).
Well, I think I haven't understood what you mean: Jack, Alsa and
PortAudio are not sound processing plugin formats... can you explain
it more simply, please? (I'm sorry, I'm not a native English speaker.)
They are not plugin systems, you're right, but if you have a processing
algorithm encapsulated in a way that it describes itself (number of ports,
controls...), then you can build wrappers (what we call backends) that map
this algorithm to a given plugin system (Ladspa, VST...) but also to a given
underlying audio system API (PortAudio, Alsa, DirectX...).
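Roughly, by "describes itself" I mean an interface along these lines (just
an illustrative sketch with made-up names, not the actual CLAM classes):

// Illustrative sketch only: a minimal self-describing processing
// interface. The names are invented for this mail and are not the
// real CLAM classes.
#include <string>

class Processing
{
public:
    virtual ~Processing() {}
    // How many audio ports the algorithm exposes, and their names.
    virtual unsigned NInputPorts() const = 0;
    virtual unsigned NOutputPorts() const = 0;
    virtual std::string InputPortName(unsigned i) const = 0;
    virtual std::string OutputPortName(unsigned i) const = 0;
    // Control parameters (gain, cutoff...) exposed the same way.
    virtual unsigned NControls() const = 0;
    virtual std::string ControlName(unsigned i) const = 0;
    virtual void SetControl(unsigned i, float value) = 0;
    // Process one block of audio, one buffer per port.
    virtual void Do(const float * const * inputs,
                    float * const * outputs,
                    unsigned long nframes) = 0;
};

A backend only ever talks to the algorithm through such an interface, so
it never needs to know what the algorithm actually computes.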
For example, if I want my algorithm to be a Ladspa plugin, I could use a Ladspa
backend, which compiles as a library, maps the connection topology of the
processing algorithm to a Ladspa descriptor and, when it is called, just
feeds data to and from the ports and executes the algorithm. A Jack backend
would be very similar, but it compiles as an application and publishes the
ports as Jack ports to the server. And so on. Of course, there is a lot of
work in handling each backend's singularities, such as JACK server
availability, API/device enumeration in PortAudio...
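To give a rough idea, a Jack backend could be little more than this (a
sketch against the real JACK C API, using the hypothetical Processing
interface from the sketch above; error handling and shutdown omitted):

// Rough sketch of a Jack backend for the hypothetical Processing
// interface sketched earlier in this mail.
#include <jack/jack.h>
#include <vector>

struct JackBackend
{
    Processing * algo;
    jack_client_t * client;
    std::vector<jack_port_t *> inPorts, outPorts;
};

static int Process(jack_nframes_t nframes, void * arg)
{
    JackBackend * self = static_cast<JackBackend *>(arg);
    std::vector<const float *> in;
    std::vector<float *> out;
    for (size_t i = 0; i < self->inPorts.size(); ++i)
        in.push_back(static_cast<const float *>(
            jack_port_get_buffer(self->inPorts[i], nframes)));
    for (size_t i = 0; i < self->outPorts.size(); ++i)
        out.push_back(static_cast<float *>(
            jack_port_get_buffer(self->outPorts[i], nframes)));
    // The backend just feeds buffers to the algorithm and runs it.
    self->algo->Do(in.empty() ? 0 : &in[0],
                   out.empty() ? 0 : &out[0], nframes);
    return 0;
}

void RunAsJackClient(Processing & algo)
{
    JackBackend backend;
    backend.algo = &algo;
    backend.client = jack_client_open("clam-sketch", JackNullOption, 0);
    // Publish one Jack port per algorithm port, using the names the
    // algorithm declares about itself.
    for (unsigned i = 0; i < algo.NInputPorts(); ++i)
        backend.inPorts.push_back(jack_port_register(
            backend.client, algo.InputPortName(i).c_str(),
            JACK_DEFAULT_AUDIO_TYPE, JackPortIsInput, 0));
    for (unsigned i = 0; i < algo.NOutputPorts(); ++i)
        backend.outPorts.push_back(jack_port_register(
            backend.client, algo.OutputPortName(i).c_str(),
            JACK_DEFAULT_AUDIO_TYPE, JackPortIsOutput, 0));
    jack_set_process_callback(backend.client, Process, &backend);
    jack_activate(backend.client);
    // ... block here until the user quits, then jack_client_close().
}

A Ladspa backend would do the analogous thing: fill a LADSPA_Descriptor
from the algorithm's port list and forward connect_port()/run() to it.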
In short, the intended result is that the developer designs the algorithm once
and then it can be used as any kind of plugin (what you asked for, didn't
you?), and not just as a plugin but also as a standalone application.
We also provide a way of relating a Qt interface to certain parts of the
algorithm. This is already available for the standalone application
backends[1] (Jack, PA, Alsa...) but we want to provide that also for plugin
systems such as VST.
[1] http://iua-share.upf.es/wikis/clam/index.php/Network_Editor_tutorial
I hope I have explained myself better, but feel free to ask me for more
information if you are interested.
David.
Mmmm... I think we are interested in two opposite things: I want a
host to be able to use any kind of plugin without having to know which
kind it is.
For example, I have some LV2, some VST and some LADSPA plugins. The
wrapper I'm talking about would be able to interface with them and let
my host use any of them (like a GStreamer for audio plugins).
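To make the idea concrete, the host would only ever see one uniform
interface, something like this (a very rough sketch; all names are
invented, and each adapter would wrap the real LADSPA, LV2 or VST API
behind it):

// Very rough sketch of the host-side wrapper idea: the host sees only
// this interface, and one adapter per plugin standard (LADSPA, LV2,
// VST...) implements it on top of the real API. All names invented.
#include <string>

class UniformPlugin
{
public:
    virtual ~UniformPlugin() {}
    virtual unsigned AudioInputs() const = 0;
    virtual unsigned AudioOutputs() const = 0;
    virtual void SetParameter(const std::string & name, float value) = 0;
    virtual void Run(const float * const * in, float * const * out,
                     unsigned long nframes) = 0;
};

// The host asks the wrapper to load "something" and does not care
// whether a LadspaAdapter, an Lv2Adapter or a VstAdapter comes back.
// (The caller owns the returned object.)
UniformPlugin * LoadPlugin(const std::string & uri);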
Instead, you want a plugin/application writer to describe their
algorithm so that it can be "translated" into a LADSPA plugin, as well as a
JACK application, etc. Am I right?
But anyway, maybe combining the two things could be of some interest:
imagine that you want to develop a plugin system capable of using the
z-transform and use it immediately in all applications that support the
wrapper. This way you could build a module for this wrapper and soon
start programming your plugins and using them, without having to wait
for the adoption of "your standard".
Also, some noticeable performance improvements could be made this way
if the wrapper were able to represent processing networks which can be
"simplified", for example a network of LTI systems with known
transfer functions (Fourier transform).
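For instance, two cascaded LTI filters with known transfer functions
H1(z) and H2(z) collapse into a single filter with H(z) = H1(z)*H2(z);
for rational transfer functions that is just a convolution of the
coefficient vectors. A trivial sketch of such a simplification:

// Trivial sketch: collapsing two cascaded LTI filters into one by
// multiplying their transfer functions, i.e. convolving numerator
// with numerator and denominator with denominator.
#include <vector>

// Polynomial (coefficient) multiplication: c = a * b.
std::vector<double> PolyMul(const std::vector<double> & a,
                            const std::vector<double> & b)
{
    if (a.empty() || b.empty()) return std::vector<double>();
    std::vector<double> c(a.size() + b.size() - 1, 0.0);
    for (size_t i = 0; i < a.size(); ++i)
        for (size_t j = 0; j < b.size(); ++j)
            c[i + j] += a[i] * b[j];
    return c;
}

// H(z) = B(z)/A(z), coefficients in ascending powers of z^-1.
struct TransferFunction
{
    std::vector<double> b, a; // numerator, denominator
};

// Cascading H1 and H2 gives (B1*B2) / (A1*A2): one filter instead of two.
TransferFunction Cascade(const TransferFunction & h1,
                         const TransferFunction & h2)
{
    TransferFunction h;
    h.b = PolyMul(h1.b, h2.b);
    h.a = PolyMul(h1.a, h2.a);
    return h;
}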
What do you think about it?
Regards,
Stefano D'Angelo