As an
interface designer, the first thing I look for on an engine's
project site is some sort of asynchronous API: I should never have to
concern myself with anything beyond calling the API from my app's single
windowing thread. FMOD, gstreamer, and my dead pkaudio project do this
very well. I don't ever want to worry about which thread a call runs in,
which threads it will affect, or what the performance cost of
*making* the call will be (as opposed to its residual effects).
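To make that concrete, here is a minimal sketch (plain C, with made-up names like enqueue_command; not code from any of the engines above) of the kind of boundary I mean: the windowing thread only pushes commands into a single-producer/single-consumer ring buffer and returns immediately, and the engine thread drains it on its own schedule.

    #include <stdatomic.h>
    #include <stdbool.h>

    typedef struct {
        int   type;    /* e.g. SET_GAIN, START, STOP */
        int   track;
        float value;
    } command;

    #define RING_SIZE 256                 /* must be a power of two */
    static command ring[RING_SIZE];
    static atomic_uint head;              /* advanced by the GUI thread    */
    static atomic_uint tail;              /* advanced by the engine thread */

    /* GUI/windowing thread: O(1), never blocks, never touches engine state */
    bool enqueue_command(command c)
    {
        unsigned h = atomic_load_explicit(&head, memory_order_relaxed);
        unsigned t = atomic_load_explicit(&tail, memory_order_acquire);
        if (h - t == RING_SIZE)
            return false;                 /* full: drop, or retry from an idle handler */
        ring[h & (RING_SIZE - 1)] = c;
        atomic_store_explicit(&head, h + 1, memory_order_release);
        return true;
    }

    /* engine thread: called once per processing cycle */
    void drain_commands(void)
    {
        unsigned t = atomic_load_explicit(&tail, memory_order_relaxed);
        unsigned h = atomic_load_explicit(&head, memory_order_acquire);
        while (t != h) {
            command c = ring[t & (RING_SIZE - 1)];
            /* apply_command(&c); */
            (void)c;
            t++;
        }
        atomic_store_explicit(&tail, t, memory_order_release);
    }

The point is that the GUI never waits on anything the audio thread holds, and the cost of making the call is a couple of atomic operations no matter what the command eventually does.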
Although I agree that this is the right design for many classes of
application, I would like to see how you propose to tackle
metering and waveform display (the two most difficult examples).
Ardour would be relatively easy to separate into interface+engine
processes (as opposed to just a lib/lib-client separation) if it were
not for these issues. Moving waveform and metering data back and forth
between two processes via a wire protocol is very expensive and
inefficient.
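To be concrete about the metering half: even one peak value per track per GUI refresh becomes a constant stream of small messages over a wire protocol. The usual alternative people suggest is a shared-memory segment that both processes map, so nothing crosses a socket at all. A rough sketch of that, assuming POSIX shm and made-up names (/engine-meters, NTRACKS), not anything Ardour actually does:

    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    #define NTRACKS 64                    /* illustrative fixed track count */

    /* Engine side: create the segment and return one float per track.
       The engine overwrites meters[i] with track i's current peak each cycle. */
    float *create_meter_segment(void)
    {
        int fd = shm_open("/engine-meters", O_CREAT | O_RDWR, 0600);
        if (fd < 0)
            return NULL;
        if (ftruncate(fd, NTRACKS * sizeof(float)) < 0) {
            close(fd);
            return NULL;
        }
        float *meters = mmap(NULL, NTRACKS * sizeof(float),
                             PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        close(fd);                        /* the mapping survives the close */
        return meters == MAP_FAILED ? NULL : meters;
    }

    /* GUI side: map the same segment read-only and poll it at the
       screen refresh rate. */
    const float *open_meter_segment(void)
    {
        int fd = shm_open("/engine-meters", O_RDONLY, 0);
        if (fd < 0)
            return NULL;
        const float *meters = mmap(NULL, NTRACKS * sizeof(float),
                                   PROT_READ, MAP_SHARED, fd, 0);
        close(fd);
        return meters == MAP_FAILED ? NULL : meters;
    }

Waveform display is the harder case, since the amount of data depends on zoom level and session length rather than on track count.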
How about creating the waveform graphics data in the engine and then just
sending a pointer to the gfx process, which displays the data using the
MIT-SHM X extension?
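Something along these lines on the gfx side, just as a sketch of the standard MIT-SHM calls (XShmCreateImage/XShmAttach/XShmPutImage), not code from Ardour; names like create_shared_waveform are made up. Note that the X client has to create the XShmImage itself, so in practice the gfx process would create the segment and hand the shmid to the engine, which then renders waveform pixels straight into it:

    #include <sys/ipc.h>
    #include <sys/shm.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/XShm.h>

    /* gfx process: create an XImage backed by a SysV shared-memory segment.
       The engine attaches the same segment (by shmid) and writes the
       rendered waveform pixels into it; the gfx process only ever blits. */
    XImage *create_shared_waveform(Display *dpy, int screen,
                                   XShmSegmentInfo *shminfo,
                                   unsigned width, unsigned height)
    {
        XImage *img = XShmCreateImage(dpy, DefaultVisual(dpy, screen),
                                      DefaultDepth(dpy, screen), ZPixmap,
                                      NULL, shminfo, width, height);
        shminfo->shmid = shmget(IPC_PRIVATE,
                                img->bytes_per_line * img->height,
                                IPC_CREAT | 0600);
        shminfo->shmaddr = img->data = shmat(shminfo->shmid, NULL, 0);
        shminfo->readOnly = False;
        XShmAttach(dpy, shminfo);         /* let the X server map it too */
        return img;
    }

    /* each redraw: no pixel data travels over the X socket */
    void blit_waveform(Display *dpy, Window win, GC gc, XImage *img,
                       unsigned width, unsigned height)
    {
        XShmPutImage(dpy, win, gc, img, 0, 0, 0, 0, width, height, False);
        XSync(dpy, False);
    }
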
PS. Please don't let this inspire you to reimplement Ardour. :-)