Nick Copeland wrote:
> [snip, because you sent your reply off-list, but I guess this should
> be sent to the list too]
If my broken English doesn't fool me, then the more I learn about CV,
the more I think it's a bad idea to import it to Linux.
Fons: "Another limitation of MIDI is its handling of context, the only
way to do this is by using the channel number. There is no way to refer
to anything higher level, to say e.g. this is a control message for note
#12345 that started some time ago."
I don't know how this would be possible for CV without much effort, but
assuming it were easy to do, there would then be the need to record
all MIDI events as CV events too, right? Or rather, Linux would then only
record all events as CV events and apps would translate them to MIDI.
I'm asking myself: if CV has advantages compared to MIDI, what is the
advantage for the industry in using MIDI? OK, when MIDI was established we
had different technology, e.g. RAM was slower, more expensive etc.; today
we have e.g. serial buses that are faster than parallel buses, so
thinking about reforming MIDI or about having something new is reasonable.
HD protocol? RTP-MIDI? USB-MIDI? Some keyboards do have new ports, e.g. USB.
For the proprietary software on macOS and Windows there are some
advantages that are needed for Linux too. Now you coders say that this
is true, but that it's better to realise it in a different way, so Linux
would gain advantages over the proprietary software without keeping
any of the disadvantages.
Sounds nice in theory, but in practice I don't believe that this is true.
There is fierce competition between proprietary software developers; why
don't they use CV for their products? Because they are less gifted than
all the Linux coders?
I hope the Linux coders don't lose sight of hardware that is mainly based on
MIDI. If I buy keyboards, effect processors etc., this hardware is
compliant with industry standards. It doesn't matter whether those standards
are good or bad; I need to be able to use affordable equipment. At the
moment Linux on my computer, and on the computers of around 30 other people I
know, can't use hardware MIDI equipment because of MIDI jitter. On the
same machines there is less jitter under Windows, so using Windows would
solve this problem for most of them. For musicians who don't have this
jitter issue, there are still other issues: e.g. if you loop-play a
passage again and again, the events sent internally within Linux and the events
sent to external hardware diverge on the timeline; for normal playback
everything is OK.
Even if this is a PITA for me, I'm staying with Linux. Musicians now need to
know which way Linux will go. Are Linux coders interested in taking care
of such issues, or do they want all musicians to buy special Linux-compatible
computers, instead of solving issues like the jitter issue
for nearly every computer? Are they interested in being compatible with
industry standards, or will they do their own thing? An answer might be
that Linux coders will do their own thing and in addition be
compatible with industry standards. I don't think that this will be
possible, because it isn't solved now and the valid arguments are time
and money right now, so how would implementing a new standard defuse the
situation?
It seems that Rui will enable automation by using MIDI and saving
the automation to MIDI files in Qtractor; at least this is an idea he
has spoken about. Isn't that a good idea?
Having CV additionally is good, no doubt about it. My final question,
the only question I wish to get an answer to, is: even today MIDI is treated
as an orphan by Linux; if we get CV, will there be any effort to
solve MIDI issues with ordinary products from the industry? Or do we need
to buy special mobos, special MIDI interfaces etc., and still have
fewer possibilities using Linux than are possible with ordinary
products from the industry?
We won't be dealing with the devil just by using the possibilities of MIDI.
Today Linux doesn't use the possibilities of MIDI, and I wonder whether a
Linux standard, e.g. CV, would solve any issues while the common MIDI
standard still isn't used in a sufficient way.
I do agree that everybody I know, me included, sometimes has problems
when using MIDI hardware because of some limitations of MIDI, but OTOH
this industry standard is a blessing. Networking of sequencers, sound
modules, effects, master keyboards, sync to tape recorders, hard disk
recorders etc. is possible, for little money, without caring which vendor
a keyboard, an effect or a mobo comes from. Linux is an exception: we
do have issues when using MIDI. But is it really MIDI that is bad? I
guess MIDI on Linux needs more attention.
Internally on Linux most things are OK, but networking with the ordinary MIDI
equipment that musicians and audio and video studios have is still a PITA.
Would CV solve that?
2 Cents,
Ralf
> From: David Olofson <david(a)olofson.net>
> These issues seem orthogonal to me. Addressing individual notes is just a
> matter of providing some more information. You could think of it as MIDI
> using
> note pitch as an "implicit" note/voice ID. NoteOff uses pitch to "address"
> notes - and so does Poly Pressure, BTW!
Not exactly note pitch. That's a common simplification/myth.
MIDI uses a 'key number'. E.g. key number 12 is *usually* tuned to C0, but is
easily re-tuned to C1, and two keys can be tuned to the same pitch yet still be
addressed independently.
It's a common shortcut to say the MIDI key number 'is the pitch'; it's actually
an index into a table of pitches. Synths can switch that tuning table to
handle other scales.
A MIDI note-on causes a synth to allocate a physical voice. That physical
voice is temporarily mapped to that MIDI-key-number so that subsequent note
control is directed to that voice. The mapping is temporary. Once the note
is done the mapping is erased. Playing the same key later will likely
allocate a different physical voice.
The MIDI-key-number is therefore an 'ID' mapping a control-source to a
physical-voice.
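This temporary key-number-to-voice mapping can be sketched in a few lines. This is a hypothetical, deliberately minimal voice allocator, not any particular synth's implementation:

```python
class Synth:
    """Minimal sketch of the MIDI key-number -> physical-voice mapping."""

    def __init__(self, num_voices=4):
        self.free_voices = list(range(num_voices))  # pool of physical voices
        self.key_to_voice = {}                      # temporary mapping

    def note_on(self, key):
        voice = self.free_voices.pop(0)   # allocate a physical voice
        self.key_to_voice[key] = voice    # key number acts as the note's ID
        return voice

    def control(self, key, value):
        # Subsequent per-note control (e.g. poly pressure) is routed
        # to the physical voice via the same key-number 'ID'.
        return self.key_to_voice[key], value

    def note_off(self, key):
        voice = self.key_to_voice.pop(key)  # mapping is erased
        self.free_voices.append(voice)      # voice returns to the pool

synth = Synth()
first = synth.note_on(60)    # key 60 is mapped to some voice
synth.note_off(60)
second = synth.note_on(60)   # same key later may get a different voice
```

Note how replaying the same key after a note-off can land on a different physical voice, exactly as described above.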
> Anyway, what I do in that aforementioned prototyping thing is pretty much
> what
> was once discussed for the XAP plugin API; I'm using explicit "virtual
> voice
> IDs", rather than (ab)using pitch or some other control values to keep
> track of notes.
I agree that addressing notes unambiguously regardless of pitch (or any
other arbitrary property) is the ideal. I wish more sequencers were not
locked into a narrow 'western pop music' mode of operation.
But many MIDI alternatives have been proposed without looking deeply
enough to realise that MIDI already supports very flexible note control.
MIDI's significant flaw is its grossly outdated 7-bit resolution; the
underlying voice model is sound.
> Virtual voices are used by the "sender" to define and
> address contexts, whereas the actual management of physical voices is done
> on the receiving end.
You have re-invented MIDI with different nomenclature ;-).
Best Regards,
Jeff McClintock
I finally couldn't resist anymore so I bought a TASCAM US-1641. At $299
it's hard to pass up. OK, so now the fun part. I loaded (shudder)
Windows and I've got Cubase 4 LE with the latest driver (2.00) and
firmware (1.02). I realize that there is no driver for the 1641 on
Linux. I've got a ton of programming experience (including some hard
real-time work) but I've never written a driver for Linux. I have no
idea where to start. I've read the following pages:
http://www.reactivated.net/weblog-content/20050806-reverse-0.2.txt
http://www.linux-usb.org/
http://www.lrr.in.tum.de/Par/arch/usb/usbdoc/
http://www.pps.jussieu.fr/~smimram/tascam/
I'm assuming that since ALSA is part of the kernel now I'll have to
compile my own kernel to play with things. If there's another way I'd
sure like to know about it. Any help would be greatly appreciated. If
anyone has any advice to help me get started you can either post it here
or send it to me off-list (that will at least save everyone else from
having to wade through all of my ignorant questions). ;-) I may not be
able to figure this out, but at least I'll give it the old college try.
Cheers,
Jan
> "There is no way to refer to anything higher level, to say e.g. this is a
> control message for note #12345 that started some time ago" could be done
> by using SysEx.
FYI MIDI sysex does support that already....
[UNIVERSAL REAL TIME SYSTEM EXCLUSIVE]
KEY-BASED INSTRUMENT CONTROL
F0 7F <device ID> 0A 01 0n kk [nn vv] .. F7
F0 7F Universal Real Time SysEx header
<device ID> ID of target device (7F = all devices)
0A sub-ID#1 = "Key-Based Instrument Control"
01 sub-ID#2 = 01 Basic Message
0n MIDI Channel Number
kk Key number
(from MIDI Standard CA# 23)
[nn,vv] Controller Number and Value
:
F7 EOX
SOME COMMONLY-USED CONTROLLERS
CC# nn Name vv
-----------------------------------------------------------
7 07H Note Volume 00H-40H-7FH
10 0AH *Pan 00H-7FH absolute
33-63 21H-3FH LSB for Controllers 01H-1FH
71 47H Timbre/Harmonic Intensity 00H-40H-7FH
72 48H Release Time 00H-40H-7FH
73 49H Attack Time 00H-40H-7FH
74 4AH Brightness 00H-40H-7FH
75 4BH Decay Time 00H-40H-7FH
76 4CH Vibrato Rate 00H-40H-7FH
77 4DH Vibrato Depth 00H-40H-7FH
78 4EH Vibrato Delay 00H-40H-7FH
91 5BH *Reverb Send 00H-7FH absolute
93 5DH *Chorus Send 00H-7FH absolute
120 78H **Fine Tuning 00H-40H-7FH
121 79H **Coarse Tuning 00H-40H-7FH
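As a sketch, the Key-Based Instrument Control message above can be assembled byte by byte. This is plain Python with no MIDI library assumed; actually sending the bytes to a port is left out:

```python
def key_based_control(channel, key, controllers):
    """Build a Universal Real Time 'Key-Based Instrument Control' SysEx
    message: F0 7F <device ID> 0A 01 0n kk [nn vv] .. F7"""
    msg = [0xF0, 0x7F,       # Universal Real Time SysEx header
           0x7F,             # device ID (7F = broadcast to all devices)
           0x0A, 0x01,       # sub-IDs: Key-Based Instrument Control, Basic
           channel & 0x0F,   # 0n: MIDI channel number
           key & 0x7F]       # kk: key number
    for nn, vv in controllers:
        msg += [nn & 0x7F, vv & 0x7F]   # controller number/value pairs
    msg.append(0xF7)         # EOX
    return bytes(msg)

# E.g. set Note Volume (07H) to 40H and Pan (0AH) hard left for
# key 60 on channel 0:
data = key_based_control(0, 60, [(0x07, 0x40), (0x0A, 0x00)])
```

Any number of [nn vv] pairs can be packed into one message, which is what the ".." in the spec notation indicates.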
Best regards,
Jeff
In the spirit of "release early, release often," I am pleased to
announce the release of Composite 0.006. This release marks the
completion of the LV2 Sampler Plugin, which supports Hydrogen drum
kits.
STATUS
------
Composite is a project with a large vision. Here is the status of the
different components:
composite-gui: Alpha (i.e. "a broken version of Hydrogen")
composite_sampler (LV2): production/stable, no GUI
libTritium: Not a public API, yet.
LINKS
-----
Composite: http://gabe.is-a-geek.org/composite/
Plugin Docs: file:///home/gabriel/code/composite-planning/plugins/sampler/1
Tarball: http://gabe.is-a-geek.org/composite/releases/composite-0.006.tar.bz2
Git: http://gitorious.org/composite
     git://gitorious.org/composite/composite.git
HOW TO USE THE PLUGIN
---------------------
To use the plugin, you need the following:
* A program (host) that loads LV2 plugins.
* A MIDI controller.
* An audio output device. :-)
The following LV2 hosts are known to work with this plugin:
Ingen http://drobilla.net/blog/software/ingen/
lv2_jack_host http://drobilla.net/software/slv2/
The following is known to _not_ work:
zynjacku (Uses a different MIDI port type)
If you don't have a hardware MIDI controller, I suggest using
jack-keyboard (http://jack-keyboard.sourceforge.net/).
The first time you run the sampler, it will create a file
~/.composite/data/presets/default.xml, which will set up presets on
Bank 0 for the two default drum kits (GMkit and TR808EmulationKit).
Sending MIDI PC 0 and PC 1 will switch between the two kits. See
composite_sampler(1) for more information on setting up presets.
ACKNOWLEDGEMENTS
----------------
With this release, I would especially like to thank:
Harry Van Haaren - For help with testing. This release has much
more polish because of you.
David Robillard - For general help with Ingen and LV2. I also
stole a lot of great design ideas from Ingen.
Peace,
Gabriel M. Beddingfield
Hi guys,
I've run into a problem while restructuring the audio backend of
TerminatorX. When I want to debug the JACK process callback, JACK throws
out my client. Is there some way of stepping through the process
callback without having my client shut down?
Gerald
Has anybody been able to build freqtweak 0.7.2 using gcc 4.4.x recently? There are some missing stdint.h includes, but beyond that there are some loss-of-precision errors that are over my head.
-Reuben
On Fri, March 19, 2010 11:53, Ralf Mardorf wrote:
> IMO automation is overrated, it's useful, but OTOH how often is it
> needed to change settings during an opus? Most times a mix, selected
> synth etc. are fixed from the start to the end of an opus. For example,
> normally a musician plays an instrument dynamically by the touch or by
> using a volume pedal. Dynamic for the loudness seldom is done by a fader
> after the recording is done.
it obviously depends on what type of music you're making. loop based music
especially makes extensive use of automation.
Hi,
How do you do automation on Linux when you work the 'modular way', and
what is the quality of those features on Linux at the moment? That question
came up and was followed by some quick research. There are things possible or
promising in the area, but we also have some weak spots here.
Non-daw and non-mixer seem to be very promising. You can make 'CV
tracks' or automation tracks in non-daw (when you add controls), draw
an automation line and connect it to a strip to automate gain
(and LADSPA). DSSI is lacking here, but it is 'planned'. LV2 is not very
popular in the Non project, so it seems it will not be possible to
automate LV2 plugin parameters, unless I'm missing something or a developer
steps in and builds LV2 support into non-mixer. (It would be a missed
chance imo if LV2 isn't supported here...)
But what to do when the softsynth is not a plugin but a standalone
application, like phasex and zynaddsubfx/yoshimi? You can use a MIDI
sequencer to control the synth by sending MIDI CC messages.
Let me quote some quick (!) test results:
"My quick test shows that *Muse* can display CCs in the piano roll, but
it only seems to allow me to edit them in the event list. Not very practical.
*Non-sequencer* only offers the event list.
*Non-Daw* allows me to add lots of controls, and edit them happily on
the timeline, but it only outputs them as "CV", which means I'd need a
Pd patch or similar to convert them into MIDI CCs.
I'll have a look at Rosegarden, once it's finished installing."
"*Rosegarden* could work, but apart from being bloated, it is also
rather complicated to add automation:
Draw an event
Open it in the Matrix editor
Then under "View" I can add a controller, but for some reason only a
select few.
This controller can then be edited by right-clicking and adding a
controller line.
I tried *Seq24*, but it keeps crashing on me. At least it allowed me to
choose a MIDI CC by number.
Damn, this should be a lot easier. Even the ancient Cakewalk was
light-years better.
_I think we have a great opportunity here for some developer to make the
world a better place. Simply make *non-daw's* timeline controllers
output MIDI CCs."_
"*QTractor* actually works very well; the only problem is that it doesn't
have a curve-drawing feature, so you need to write lots of little automation
points. But it lets you select from all possible MIDI CCs. Just select
"controller" in the MIDI clip editor, where it normally says "note" (or
similar).
Then of course connect the MIDI output to your softsynth, and remember
to define which synth knob is controlled by which CC (the MIDI mapping) in
the synth."
"it looks like *Dino *has a nice curve editor."
"MIDI mapping in *Phasex* and *AMS* seems to be OK.
And here is the *Zynaddsubfx* MIDI implementation (seriously lacking, if
you ask me):
http://zynaddsubfx.sourceforge.net/doc_3.html
"
* So it looks like Qtractor is well suited to the job, although it
misses the nice curve editor that Dino has, for automation with
external synths.
* Zynaddsubfx needs improvements when it comes to MIDI mapping.
* A big improvement would be to make *non-daw's* timeline
controllers output MIDI CCs.
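The CV-to-CC conversion asked for above is, at its core, just quantisation to 7 bits. As a hypothetical sketch (not an existing tool), each normalised automation point could be turned into a Control Change message like this:

```python
def cv_to_cc(cv_value, cc_number, channel=0):
    """Quantise a normalised CV value (0.0..1.0) into a 7-bit MIDI
    Control Change message: (status, controller number, value)."""
    value = max(0, min(127, round(cv_value * 127)))  # clamp to 7 bits
    status = 0xB0 | (channel & 0x0F)   # Control Change on this channel
    return bytes([status, cc_number & 0x7F, value])

# A drawn automation ramp becomes a stream of CC #7 (volume) events.
ramp = [i / 10 for i in range(11)]     # 0.0, 0.1, ... 1.0
events = [cv_to_cc(v, 7) for v in ramp]
```

The 7-bit quantisation is exactly the resolution loss discussed earlier in this thread; a real converter would also have to timestamp and rate-limit the resulting events.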
Other questions are: "Is it possible atm to automate LV2 plugins in a
host (lv2rack) and DSSI (ghostess)?"
See this post as a sort of feedback from some Linux audio users. Maybe it
can lead to some improvements...
Thanks,
\r
ps. see also thread here: http://linuxmusicians.com/viewtopic.php?f=4&t=2535
For more information read here:
http://en.wikipedia.org/wiki/MIDI_beat_clock
My question: does something like this exist for ALSA? I am interested in sending MIDI beat
clock signals from Hydrogen to external hardware synthesisers/arpeggiators, and I am explicitly
not interested in syncing them to any timecode, because the external machines have to run
independently and in a random order; they only have to sync their beats.
Here are the MBC specs.
MIDI beat clock defines the following real-time messages:
* clock (decimal 248, hex 0xF8)
* tick (decimal 249, hex 0xF9)
* start (decimal 250, hex 0xFA)
* continue (decimal 251, hex 0xFB)
* stop (decimal 252, hex 0xFC)
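MIDI beat clock runs at 24 pulses per quarter note, so for a given tempo the byte stream and its timing can be sketched as follows. This only computes the values; actual delivery would go through e.g. an ALSA sequencer or raw MIDI port, which is not shown:

```python
# MIDI real-time message bytes (from the list above)
CLOCK, TICK, START, CONTINUE, STOP = 0xF8, 0xF9, 0xFA, 0xFB, 0xFC

def clock_interval(bpm):
    """Seconds between successive 0xF8 clock bytes: MIDI beat clock
    is defined as 24 pulses per quarter note."""
    return 60.0 / (bpm * 24)

# At 120 BPM a clock byte must be sent roughly every 20.8 ms.
interval = clock_interval(120)

def beat_stream(beats):
    """Byte sequence for 'start, N beats worth of clocks, stop'."""
    return bytes([START] + [CLOCK] * (24 * beats) + [STOP])
```

Because the receivers derive tempo purely from the spacing of the 0xF8 bytes, steady delivery timing matters far more than absolute position, which fits the "sync the beats, not the timeline" use case described above.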
And about ticks:
I found out that Linux audio apps all have different or their own definitions of the quantity of ticks per beat.
Does it make sense to agree on a common ticks-per-beat value, or is this irrelevant for syncing? I especially mean syncing via JACK transport here.
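The ticks-per-beat mismatch is largely a unit-conversion problem rather than a sync problem: a tick position can be rescaled between any two resolutions. A minimal sketch (the PPQ values are just examples):

```python
def convert_ticks(tick, ppq_from, ppq_to):
    """Rescale a tick position between two PPQ (pulses-per-quarter-note)
    resolutions, e.g. an app using 192 ticks/beat talking to one
    using 960 ticks/beat."""
    return round(tick * ppq_to / ppq_from)

# Tick 384 at 192 PPQ (the start of beat 3) is tick 1920 at 960 PPQ.
pos = convert_ticks(384, 192, 960)
```

Rounding error only appears when converting down to a coarser resolution, which may be why a shared, reasonably high PPQ would still be convenient even though it isn't strictly required for syncing.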
greetings wolke