- a delay line,
- allowing high-quality fractional-sample delays,
- at least 12 outputs, each with two controls: delay and gain,
- smooth 'crossfading' between two control sets, covering both delay
  and gain, controlled by a GUI or by OSC.
It should not take more than 20% CPU on a 2 GHz P4
(other things have to run at the same time).
If you know how to get this done faster than by actually
writing it in C++, please let me know!
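(For reference, a minimal offline Python/numpy sketch of such a multi-tap
fractional delay. It uses plain linear interpolation for brevity; a really
high-quality version would rather use higher-order Lagrange or windowed-sinc
interpolation. The function name and the example tap values are made up for
illustration, this is not the attached patch.)

import numpy as np

def multitap_fractional_delay(x, sr, taps):
    """x: mono input, sr: sample rate, taps: list of (delay_seconds, gain)."""
    max_delay = max(d for d, _ in taps)
    pad = int(np.ceil(max_delay * sr)) + 1
    buf = np.concatenate([np.zeros(pad), x])   # leading zeros act as delay memory
    n = np.arange(len(x)) + pad                # positions of the "current" samples
    outs = []
    for delay_s, gain in taps:
        d = delay_s * sr                       # fractional delay in samples
        i = int(np.floor(d))
        frac = d - i
        # linear interpolation between the two neighbouring delayed samples
        y = (1.0 - frac) * buf[n - i] + frac * buf[n - i - 1]
        outs.append(gain * y)
    return np.stack(outs)                      # shape: (len(taps), len(x))

# example: 12 taps with made-up delays and gains
sr = 44100
x = np.random.default_rng(0).standard_normal(sr)            # 1 s of test noise
taps = [(0.0005 * (k + 1) + 0.00025, 0.8 ** k) for k in range(12)]
outs = multitap_fractional_delay(x, sr, taps)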
I think Pd is faster for writing such a thing. A version is attached. I
omitted the crossfading and the OSC control, as those require some
externals or abstractions, but neither is hard to add.
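(As a rough illustration of the omitted part, here is a minimal Python/numpy
sketch of fading a single tap from control set A to control set B by ramping
both delay and gain over a fixed fade time. The function name and parameters
are made up, and this is not how the attached patch would do it.)

import numpy as np

def crossfade_tap(x, sr, delay_a, gain_a, delay_b, gain_b, fade_s):
    """Fade one tap from (delay_a, gain_a) to (delay_b, gain_b) over fade_s seconds."""
    pad = int(np.ceil(max(delay_a, delay_b) * sr)) + 1
    buf = np.concatenate([np.zeros(pad), x])
    n = np.arange(len(x)) + pad
    ramp = np.clip(np.arange(len(x)) / (fade_s * sr), 0.0, 1.0)  # 0 -> 1 over the fade
    d = ((1.0 - ramp) * delay_a + ramp * delay_b) * sr           # per-sample delay (samples)
    g = (1.0 - ramp) * gain_a + ramp * gain_b                    # per-sample gain
    i = np.floor(d).astype(int)
    frac = d - i
    return g * ((1.0 - frac) * buf[n - i] + frac * buf[n - i - 1])

(Note that sweeping the delay time like this pitch-shifts the signal slightly
during the fade; an alternative is to run both control sets in parallel and
crossfade only their outputs.)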
In Pd (or other real-time alternatives) you can do that in 5 minutes, whereas
in C/C++ you need a day or more ...
So the most important lesson I had to learn when prototyping
algorithms and ideas: don't use C or C++! Only use them if there is really
no other solution (which is not often the case)!
And don't think about optimization too early ... ;)
BTW: I'm using Python (scipy/numpy) for algorithm prototyping (sometimes
I have to use Octave/MATLAB, because it's somehow a "standard" in the
DSP world), and Pure Data for writing/testing realtime algorithms.
Best regards,
Georg