[linux-audio-user] Re: [linux-audio-dev] music engine

Dave Robillard drobilla at connect.carleton.ca
Wed Apr 12 12:40:45 UTC 2006


On Tue, 2006-04-11 at 12:10 -0700, Kjetil Svalastog Matheussen wrote:
> Paul Davis:
> >> As an interface designer, the first thing I look for on an engine's
> >> project site is some sort of asynchronous API - I should never have to
> >> concern myself with anything beyond calling the API from my app's one
> >> windowing thread. FMOD, GStreamer, and my dead pkaudio project do this
> >> very well. I don't ever want to worry about what thread a call happens
> >> in, what threads it will affect, or what the performance effects of
> >> *making* the call will be (as opposed to residual effects).
> >
> >Although I agree that this is the right design for many classes of
> >application, I would like to see how you propose to tackle
> >metering and waveform display (the two most difficult examples).
> >Ardour would be relatively easy to separate into interface+engine
> >processes (as opposed to just a lib/lib-client separation) if it were
> >not for these issues. Moving waveform and metering data back and forth
> >between two processes via a wire protocol is very expensive and
> >inefficient.
> 
> How about creating the gfx waveform data in the engine and then just
> sending a pointer to the gfx process, which displays the data using the
> MIT-SHM X extension?
> 
> PS. Please don't let this inspire you to reimplement Ardour. :-)

Actually, Lars Luthman's DSSI Oscilloscope plugin does exactly this
(a rough sketch of the idea follows below).

-DR-
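
In outline, the handoff Kjetil describes could look roughly like the
sketch below. It uses plain System V shared memory (the same shmget/shmat
calls that MIT-SHM relies on) rather than the full MIT-SHM image path,
and every name in it (WAVE_SHM_KEY, struct peak_block) is made up for
illustration; it is a sketch of the idea, not how the DSSI Oscilloscope
plugin is actually implemented.

/*
 * Sketch: the engine renders waveform peaks into a System V shared
 * memory segment; the GUI process attaches the same segment and draws
 * from it, so no sample or peak data crosses a wire protocol.
 * All names here are illustrative only.
 */
#include <stdio.h>
#include <string.h>
#include <sys/ipc.h>
#include <sys/shm.h>

#define WAVE_SHM_KEY 0x57415645  /* arbitrary key, "WAVE" in ASCII */
#define N_PEAKS      4096

struct peak_block {
    unsigned int generation;     /* bumped by the engine after each update */
    float peaks[N_PEAKS];        /* pre-reduced waveform peaks */
};

/* Engine side: create (or reuse) the segment and map it writable. */
static struct peak_block *engine_create_block(void)
{
    int id = shmget(WAVE_SHM_KEY, sizeof(struct peak_block), IPC_CREAT | 0600);
    void *p;

    if (id < 0)
        return NULL;
    p = shmat(id, NULL, 0);
    return (p == (void *)-1) ? NULL : (struct peak_block *)p;
}

/* GUI side: attach the existing segment read-only and draw from it. */
static const struct peak_block *gui_attach_block(void)
{
    int id = shmget(WAVE_SHM_KEY, sizeof(struct peak_block), 0);
    void *p;

    if (id < 0)
        return NULL;
    p = shmat(id, NULL, SHM_RDONLY);
    return (p == (void *)-1) ? NULL : (const struct peak_block *)p;
}

int main(void)
{
    /* Engine and GUI would be separate processes; both halves run here
     * only to show the calls involved. */
    struct peak_block *w = engine_create_block();
    const struct peak_block *r;

    if (!w)
        return 1;
    memset(w->peaks, 0, sizeof(w->peaks));
    w->generation = 1;

    r = gui_attach_block();
    if (r)
        printf("GUI sees generation %u, %d peaks\n", r->generation, N_PEAKS);
    return 0;
}

The point is that the GUI never receives peak data over a socket: it
simply re-reads the segment whenever it wants to redraw, and the
generation counter tells it whether anything has changed since the
last repaint.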



