Cedric Roux wrote:
> ----- "Dave Phillips" <dlphillips(a)woh.rr.com> wrote:
>> Btw, all the sounds in this piece are originally
>> from the 8mbgmsfx.sf2
>
> ok, so just to be sure, there is no real instrument
> in there? No real flute, horn, bassoon, whatever
> recorded with a microphone?

That's correct, although I assume the soundfont was made from samples
of acoustic instruments.
Here's the MIDI sequencing toolchain:
sequencer --> qsynth --> jack-rack --> system_out and/or ecasound,
ardour, qtractor, whatever
QSynth uses the 8mbgmsfx soundfont, JACK-Rack employs the CAPS Versatile
Plate Reverb, and I have a script that runs ecasound in its interactive
mode for handy recordings. I also have a script that sets up the
software and its connections (thanks to QJackCtl's PatchBay). It's all
very fast.
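
In case it's useful, a bare-bones version of that setup script might look
something like the sketch below. The client and port names are only
examples (they depend on how QSynth and JACK-Rack register themselves with
JACK on your box), so check yours with jack_lsp before copying anything:

#!/bin/sh
# Sketch of a session-setup script: start the synth and the effects
# rack, then wire them together with jack_connect.
# Client/port names are examples only -- verify them with jack_lsp.

qsynth &       # QSynth loads 8mbgmsfx.sf2 from its saved settings
jack-rack &    # then load the CAPS Versatile Plate Reverb in its GUI
sleep 2        # give both clients time to register with JACK

# synth -> effects rack
jack_connect qsynth:left  jack_rack:in_1
jack_connect qsynth:right jack_rack:in_2

# effects rack -> soundcard
jack_connect jack_rack:out_1 system:playback_1
jack_connect jack_rack:out_2 system:playback_2

# To record, start ecasound in its interactive mode with a JACK input
# and connect the rack's outputs to ecasound's ports as well:
#   ecasound -c -i jack -o take01.wav

As mentioned above, my own script lets QJackCtl's PatchBay handle the
connections, but jack_connect does the same job from the command line.
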
I make heavy use of Csound's Waveguide filter as implemented in the
AVSynthesis program. It's a powerful tool for manipulating sound, and
I'm far from exhausting its possibilities. In fact, I'm far from
exhausting the possibilities of AVSynthesis.
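
If anyone wants to hear what that filter does outside of AVSynthesis, a
throwaway test is easy to set up. The sketch below simply runs white noise
through Csound's wguide1 opcode (a delay line plus a first-order lowpass
filter with feedback). The parameter values are made up for illustration
only, and it makes no attempt to reproduce AVSynthesis's own waveguide
instrument:

#!/bin/sh
# Quick test of Csound's wguide1 opcode, outside AVSynthesis.
# Parameter values below are made up purely for illustration.

cat > wgtest.csd <<'EOF'
<CsoundSynthesizer>
<CsOptions>
-o dac
</CsOptions>
<CsInstruments>
sr     = 44100
ksmps  = 64
nchnls = 2
0dbfs  = 1

instr 1
  anoise rand    0.2                      ; white-noise excitation
  awg    wguide1 anoise, 220, 3000, 0.95  ; freq, lowpass cutoff, feedback
  outs   awg, awg
endin
</CsInstruments>
<CsScore>
i 1 0 5
</CsScore>
</CsoundSynthesizer>
EOF

csound wgtest.csd

Pushing the feedback closer to 1.0 makes the resonance ring longer and
sound more pitched, which gives some idea of why this sort of filter is
so handy for mangling material.
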

> The computer has also a little camera. Maybe I'll
> use that to control music with my body in the air
> (like wearing red gloves for easy detection or I
> don't know) and I was wondering if I could have
> other more or less realistic instruments (I think
> about cello, which I like a lot) that I could control
> that way.

You might want to investigate Csound, SuperCollider3, and the Processing
environment. IIRC they all have tools for exploring sensor control. And of
course ask around on this list; I think some people here have worked with
motion activation and other kinds of sensor input.

> Thanks for sharing cool music and happy music!

Thank you, and please keep us informed about your Linux music adventures. :)
Best,
dp