[linux-audio-user] Re: [linux-audio-dev] music engine

Paul Davis paul at linuxaudiosystems.com
Sat Apr 8 03:26:54 UTC 2006


On Fri, 2006-04-07 at 14:23 -0800, Patrick Stinson wrote:
> As an added note to my previous comments, I really like the app
> interface that mpd uses. Writing ASCII events, following some spec or
> another, to a file descriptor (a socket in mpd's case) seems to be a
> terrific way to communicate with apps and libs. I run the svn
> pksampler like this:
> 
> pksampler | pkaudiod
> 
> I could just as easily have it use a socket or some other kind of pipe,
> and this is easy to debug, script, profile, etc. On the client and
> server sides, I wrap my 10-minute RPC into an interface with generic
> calls. Everything having to do with input (MIDI, OSC, whatever)
> happens in the app code. The interface functions resemble those of
> FMODEx's interface; add sample, connect to channel, set sample attrs,
> etc.
> 
> As an interface designer, the first thing I look for on an engine's
> project site is some sort of asynchronous API - I should never have to
> concern myself with anything beyond calling the API from my app's one
> windowing thread. FMOD, gstreamer, and my dead pkaudio project do this
> very well. I don't ever want to worry about what thread it happens in,
> what threads it will affect, or what the performance effects of
> *making* the call will be (as opposed to residual effects).
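
As a rough illustration of that style, here is a minimal C sketch of a
client writing newline-terminated ASCII commands to a file descriptor,
as in the pksampler | pkaudiod pipe above. The command names
(add_sample, connect, set) are made up for the example; they are not
mpd's or pksampler's actual protocol.

/* Minimal sketch of the ASCII-event style described above.  The
 * command names are hypothetical, not any real protocol. */
#include <stdio.h>
#include <unistd.h>

/* Write one newline-terminated ASCII command to a file descriptor
 * (stdout when run as "client | engine", or a connected socket fd).
 * The call just formats and writes bytes, then returns, so it is
 * cheap to issue from a GUI thread. */
static int send_cmd(int fd, const char *cmd)
{
    char line[256];
    int n = snprintf(line, sizeof(line), "%s\n", cmd);
    if (n < 0 || n >= (int)sizeof(line))
        return -1;
    return write(fd, line, (size_t)n) == (ssize_t)n ? 0 : -1;
}

int main(void)
{
    send_cmd(STDOUT_FILENO, "add_sample /tmp/kick.wav id=1");
    send_cmd(STDOUT_FILENO, "connect id=1 channel=0");
    send_cmd(STDOUT_FILENO, "set id=1 gain=0.8");
    return 0;
}

Because each call only formats and writes a line before returning, the
caller never waits on the engine, which is roughly the asynchronous
property described above.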

although i agree that this is the right design for many classes of
application, i would like to see how you propose to tackle
metering and waveform display (the two most difficult examples).
ardour would be relatively easy to separate into interface+engine
processes (as opposed to just a lib/lib-client separation) if it were
not for these issues. moving waveform and metering data back and forth
between two processes via a wire protocol is very expensive and
inefficient.
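
to put rough numbers on that, here is a back-of-the-envelope sketch in
C; every figure in it (track count, period size, message size) is an
assumption for illustration, not a measurement from ardour or any
other application.

/* Back-of-the-envelope sketch of the wire-protocol cost of metering.
 * All figures are assumed for illustration only. */
#include <stdio.h>

int main(void)
{
    const double sample_rate = 48000.0;  /* Hz */
    const double period      = 1024.0;   /* frames per process cycle */
    const double updates_hz  = sample_rate / period;  /* ~47 peaks/sec per port */
    const int    ports       = 128 * 2;  /* assumed: 128 stereo tracks */
    const int    msg_bytes   = 48;       /* assumed ASCII message incl. framing */

    double bytes_per_sec = updates_hz * ports * msg_bytes;
    printf("metering alone: ~%.0f kB/s, plus a format/parse cost per message\n",
           bytes_per_sec / 1024.0);

    /* Waveform display is harder still: redrawing a zoomed-out view can
     * need peak data covering millions of frames at once, which either
     * floods the same channel or forces an out-of-band path such as
     * shared memory or mmap'd peak files. */
    return 0;
}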

--p
