Here is the setup I'm looking to do:
midi master -> special program
ddr pad -> special program
special program -> hydrogen
I've got a hardware midi device controlling the master tempo. aseqdump shows
the tempo messages x times a second.
I want to either find or make a program so I can use a DDR pad (USB gamepad)
and assign different timed MIDI patterns to its buttons. For example, with
the program configured for 4/4 time, holding down or toggling a button would
play its pattern; button one, say, alternates a quarter-note MIDI note 36
with quarter-note rests:
1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
2 -
41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth
If note 36 is a bass drum and 41 is a snare, you'd get a basic kick-and-snare
beat with both buttons held down or toggled. Of course I'd also like to be
able to bind multiple patterns per button.
Each pattern would describe one measure, and the flow of a live solo
performance could easily be changed on the fly, leaving hands free to play
guitar.
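The pattern syntax above is easy to machine-read. Here is a minimal sketch of
a parser and a per-measure scheduler; the function names, duration table, and
timing math are my own assumptions, not any existing program:

```python
# Sketch only: parse "note,duration|..." pattern strings (one measure
# each) and compute absolute event times for a given tempo.

DURATIONS = {"whole": 4.0, "half": 2.0, "quarter": 1.0, "eighth": 0.5}

def parse_pattern(text):
    """Turn '36,quarter|rest,quarter|...' into (note, beats) steps."""
    steps = []
    for token in text.split("|"):
        name, dur = token.split(",")
        note = None if name == "rest" else int(name)
        steps.append((note, DURATIONS[dur]))
    return steps

def schedule(steps, bpm=120):
    """Return (start_seconds, note, length_seconds) events for one measure."""
    beat_len = 60.0 / bpm
    events, t = [], 0.0
    for note, beats in steps:
        if note is not None:
            events.append((t, note, beats * beat_len))
        t += beats * beat_len
    return events

kick = parse_pattern("36,quarter|rest,quarter|36,quarter|rest,quarter")
print(schedule(kick, bpm=120))  # [(0.0, 36, 0.5), (1.0, 36, 0.5)]
```

A real implementation would feed these events into the ALSA sequencer queue
instead of printing them, retriggering the measure while the button is held.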
Anything out there like this, and I just don't know about it?
If I don't find anything I want to hack something together, so a primer on
MIDI tempo sync and how to program with it would be appreciated.
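For the primer part: MIDI tempo sync ("MIDI clock") is simple at the protocol
level. The master sends a one-byte timing-clock message (0xF8) 24 times per
quarter note, plus start (0xFA) and stop (0xFC); a slave recovers the tempo
from the spacing of the clocks. A self-contained sketch of just the arithmetic
(no ALSA code):

```python
# MIDI real-time sync basics: 0xF8 arrives 24 times per quarter note
# (fixed by the MIDI spec), so tempo falls out of the clock spacing.

MIDI_CLOCK, MIDI_START, MIDI_STOP = 0xF8, 0xFA, 0xFC
PPQN = 24  # timing clocks per quarter note

def bpm_from_clocks(timestamps):
    """Estimate BPM from timestamps (seconds) of consecutive 0xF8 bytes."""
    if len(timestamps) < 2:
        return None
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg = sum(deltas) / len(deltas)
    return 60.0 / (avg * PPQN)

# At 120 BPM a clock arrives every 60/(120*24) s, about 20.8 ms:
ticks = [i * (60.0 / (120 * PPQN)) for i in range(25)]
print(round(bpm_from_clocks(ticks), 2))  # 120.0
```

In practice you would average over a window of clocks to smooth jitter, and
step your pattern playback on each incoming clock rather than free-running.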
Thanks,
Nathanael
On Fri, Jul 16, 2010 at 5:27 AM, Geoff King <gsking1(a)gmail.com> wrote:
> Niels, Thank you for taking the time to work on improving Envy24Ctl.
> I tried your patch for the peaks and it seemed to work fine for me. I
> can't comment on the code (as I'm much more musician than programmer),
> but had no problem patching and installing. Looking forward to trying
> this new patch and whatever else you come up with. Geoff
Geoff -- thank you for trying out the patch! (And, OT, thanks for
working w/ Rui to help fix
http://sourceforge.net/tracker/?func=detail&atid=733076&aid=3021645&group_i…
:: "help fix qtractor crash on bus changing/configuration (3021645)" --
Envy24 and multichannel qtractor users, yes, it's time for a "svn up" ).
FYI, to try out the new levelmeters "easy", if you're running Fedora
x86_64, use the "envy24control" binary directly, or drop the
levelmeters.c file into your build:
http://nielsmayer.com/npm/Efficient-Meters-Envy24Control.tgz
( for details http://nielsmayer.com/npm/Efficient-Meters-Envy24Control.README )
........
On a completely different "note" the envy 24 manual (
http://alsa.cybermirror.org/manuals/icensemble/envy24.pdf )
has interesting info on the envy24 digital mixer that I've snapshotted:
Diagram: http://nielsmayer.com/npm/envy24mixer-architecture.png
Note the way it truncates in the mixer: the more inputs you "mix" at
once, the fewer bits each input source gets, and it's not clear what
kind of dither is applied, or whether it's a straight truncation to 24
bits... ultimately the Envy24 seems oriented towards producing a 16-bit,
not a 24-bit, master (which makes sense given that the chip is well over
a decade old and "prosumer" HD audio production was rare):
..........
4.5.5 Multi-Track Digital Monitoring
The Envy24 integrates a 36-bit resolution digital hardware mixer. The
width of the data path is strictly to
ensure that during processing of all the channels, under any
condition, no resolution is lost. The dynamic range
of the end user system will be limited by the range of the physical
output devices used. In order to maintain
identical gain to the input stream (i.e. 0dB), the resulting 24-bit is
not msb-aligned to the 36-bit. The overflow
bits correspond to the analog distortion due to saturation. The user
would need to reduce the overall attenuation
of the inputs to avoid clipping. Insertion of the digital mixer adds
only a single sample cycle delay with respect
to the original data. This extremely low latency all digital mixer
provides monitoring functionality and can
replace a traditional external analog input mixer. There are 20
independent audio data streams to mix and
control the volume. ...
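A quick sanity check of the manual's claim quoted above: summing up to 20
full-scale 24-bit streams without losing resolution needs ceil(log2(20)) = 5
extra integer bits, and the 36-bit data path leaves 36 - 24 = 12, so no
resolution is lost inside the mixer itself (truncation can only happen at the
output stage):

```python
# Back-of-the-envelope headroom check for the Envy24's 36-bit mixer.
import math

SOURCE_BITS, MIXER_BITS, STREAMS = 24, 36, 20

needed = math.ceil(math.log2(STREAMS))   # extra bits to sum STREAMS inputs
available = MIXER_BITS - SOURCE_BITS     # headroom above a 24-bit source
print(needed, available, needed <= available)  # 5 12 True
```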
..............
--Niels
http://nielsmayer.com
Hi :)
my PCI MIDI devices always passed the alsa-midi-latency-test with < 1.01
ms, on several Linux installs, with several kernels and graphics drivers.
It's audible that this can't be true. I was sceptical when my USB MIDI
passed with around 2 ms, because recordings showed that there is much
more latency for the USB MIDI under Linux; only Windows manages such low
jitter on my machine, see the archives. I won't install Windows again to
compare the PCI MIDI. This computer can do better, but MIDI fails on
Linux. This comparison is just for testing the abilities of the
hardware.
Please test yourself what I did today and you'll be able to hear it too.
It isn't only audible to gifted musicians; you'll really hear that the
FluidSynth DSSI kick and the kick played by an external MIDI drum module
are played one after the other, not in unison. Note that I tested
that this drum module is still 100% OK.
Please read the test and do the same as I did.
I decided to do the audio MIDI jitter test with 64 Studio 3.3 alpha. I
kept the proprietary 'nvidia' driver, even though the 'nv' driver, used
by 64 Studio 3.0 beta, seems to cause tens of microseconds less jitter.
First I checked the package 'rt-irq'. It's version 20090810-0ubuntu1
(karmic) and, according to diff, I hadn't installed the current version
over it before. On Rui's download page http://www.rncbc.org/jack/ the
current version is 20090920.
I installed a dummy package rtirq-init_20090920_all.deb and updated both
rtirq files.
For /etc/default/rtirq I edited
line 30 from
RTIRQ_NAME_LIST="rtc snd usb i8042"
to
RTIRQ_NAME_LIST="rtc snd i8042"
line 33 from
RTIRQ_PRIO_HIGH=90
to
RTIRQ_PRIO_HIGH=98
I secured the MIDI adaptor cable with screws to the 16:0 device and used
the audio IOs of the same card. It's the card with IRQ 20; Envy24
Control automatically chose this card.
After a shutdown and startup I performed the ALSA MIDI latency test.
$ su -c "poff dsl-provider"
$ su -c "cpufreq-selector -g performance"
^C
$ cd /etc/init.d
$ ./rtirq status
PID CLS RTPRIO NI PRI %CPU STAT COMMAND
386 FF 95 - 135 0.0 S< irq/8-rtc0
1140 FF 90 - 130 0.0 S< irq/21-ICE1712
1137 FF 89 - 129 0.0 S< irq/20-ICE1712
379 FF 85 - 125 0.0 S< irq/1-i8042
378 FF 84 - 124 0.1 S< irq/12-i8042
102 FF 50 - 90 0.0 S< irq/9-acpi
565 FF 50 - 90 0.0 S< irq/14-ide0
570 FF 50 - 90 0.0 S< irq/16-ohci_hcd
587 FF 50 - 90 0.0 S< irq/22-ahci
597 FF 50 - 90 0.0 S< irq/19-ehci_hcd
598 FF 50 - 90 0.0 S< irq/22-ohci1394
601 FF 50 - 90 0.0 S< irq/17-ohci_hcd
603 FF 50 - 90 0.0 S< irq/18-ohci_hcd
606 FF 50 - 90 0.0 S< irq/17-ohci_hcd
609 FF 50 - 90 0.0 S< irq/18-ohci_hcd
932 FF 50 - 90 0.0 S< irq/7-parport0
1634 FF 50 - 90 0.3 S< irq/18-nvidia
1946 FF 50 - 90 0.0 S< irq/26-eth0
4 FF 49 - 89 0.0 S< sirq-high/0
5 FF 49 - 89 0.0 S< sirq-timer/0
6 FF 49 - 89 0.0 S< sirq-net-tx/0
7 FF 49 - 89 0.0 S< sirq-net-rx/0
8 FF 49 - 89 0.0 S< sirq-block/0
9 FF 49 - 89 0.0 S< sirq-tasklet/0
10 FF 49 - 89 0.0 S< sirq-sched/0
11 FF 49 - 89 0.0 S< sirq-hrtimer/0
12 FF 49 - 89 0.0 S< sirq-rcu/0
18 FF 49 - 89 0.0 S< sirq-high/1
19 FF 49 - 89 0.0 S< sirq-timer/1
20 FF 49 - 89 0.0 S< sirq-net-tx/1
21 FF 49 - 89 0.0 S< sirq-net-rx/1
22 FF 49 - 89 0.0 S< sirq-block/1
23 FF 49 - 89 0.3 S< sirq-tasklet/1
24 FF 49 - 89 0.0 S< sirq-sched/1
25 FF 49 - 89 0.0 S< sirq-hrtimer/1
26 FF 49 - 89 0.0 S< sirq-rcu/1
$ su -c "chgrp audio /dev/hpet"
$ su -c "sysctl -w dev.hpet.max-user-freq=64"
$ su -c "modprobe snd-hrtimer"
$ alsa-midi-latency-test -l
Port Client name Port name
14:0 Midi Through Midi Through Port-0
16:0 TerraTec EWX24/96 TerraTec EWX24/96 MIDI
20:0 TerraTec EWX24/96 TerraTec EWX24/96 MIDI
$ alsa-midi-latency-test -Rrw=5 -i16:0 -o16:0
> alsa-midi-latency-test 0.0.3
> set_realtime_priority(SCHED_FIFO, 99).. done.
> clock resolution: 0.000000001 s
sample; latency_ms; latency_ms_worst
0; 1.07; 1.07
9999; 1.00; 1.07
> latency distribution:
1.0 - 1.1 ms: 9994
##################################################
1.1 - 1.2 ms: 6 #
> SUCCESS
best latency was 0.99 ms
worst latency was 1.07 ms, which is great.
$ alsa-midi-latency-test -Rrw=5 -i16:0 -o16:0
> alsa-midi-latency-test 0.0.3
> set_realtime_priority(SCHED_FIFO, 99).. done.
> clock resolution: 0.000000001 s
sample; latency_ms; latency_ms_worst
0; 1.06; 1.06
timeout: there seems to be no connection between ports 16:0 and 16:0
I didn't check any connections, but ran the test again. I've got no idea
what goes wrong when I get those timeouts. The adaptor cable with
opto-coupler is brand new and not self-made, and the timeout can happen
with both cards: my old card, S/N 00xxxxx with an Envy24 chip not by
Terratec, and the new second-hand card from Ebay, S/N 04xxxxx with an
Envy24 chip by Terratec.
I didn't compare all the other chips, but the boards' layouts don't
differ.
When doing the audio MIDI test the MIDI interface worked without missing
data.
$ alsa-midi-latency-test -Rrw=5 -i16:0 -o16:0
> alsa-midi-latency-test 0.0.3
> set_realtime_priority(SCHED_FIFO, 99).. done.
> clock resolution: 0.000000001 s
sample; latency_ms; latency_ms_worst
0; 1.06; 1.06
5727; 1.06; 1.06
9999; 1.01; 1.06
> latency distribution:
1.0 - 1.1 ms: 9992
##################################################
1.1 - 1.2 ms: 8 #
> SUCCESS
best latency was 0.99 ms
worst latency was 1.06 ms, which is great.
Compared to the test done with the old setup
http://lists.64studio.com/pipermail/64studio-users/2010-July/004537.html
nothing changed.
$ jackd -Rch -dalsa -dhw:0 -r96000 -p1024 -n2
jackdmp 1.9.5
no message buffer overruns
no message buffer overruns
JACK server starting in realtime mode with priority 10
creating alsa driver ... hw:0|hw:0|1024|2|96000|0|0|nomon|swmeter|-|
32bit
Using ALSA driver ICE1712 running on card 0 - TerraTec EWX24/96 at
0xcf00, irq 20
configuring for 96000Hz, period = 1024 frames (10.7 ms), buffer = 2
periods
ALSA: final selected sample format for capture: 32bit integer
little-endian
ALSA: use 2 periods for capture
ALSA: final selected sample format for playback: 32bit integer
little-endian
ALSA: use 2 periods for playback
$ qtractor
Version: 0.4.6
Build: Jun 3 2010 15:26:56
JACK Session support disabled.
A Yamaha DX7 was directly connected to the hardware MIDI in and an
Alesis D4 was directly connected to the hardware MIDI out. The analog
audio IOs of the sound card and the main outs of the Alesis D4 were
connected to a Behringer UB2442FX-Pro. So, valid home recording
equipment, without anything critical. Apart from the limit it puts on
sound quality, there shouldn't be any issues caused by this setup.
For the MIDI audio test's LXDE session I'm running JACK2 in GNOME
Terminal tab 1 and Qtractor in tab 2, plus Evolution offline with this
email open for writing to the lists, Envy24 Control, and nothing else.
For Qtractor I switched from queue timer (resolution) 'system timer
(1000 Hz)' to 'HR timer (1000000000 Hz)' and restarted Qtractor.
1.
I disconnected all audio connections for JACK and connected hw MIDI in
to hw MIDI out. Live I played forefinger left hand kick, thumb left hand
snare and forefinger right hand ride.
To get an impression for the latency I connected the DX7 MIDI out
directly to the D4 MIDI in and then I reconnected to the PCI card.
The difference is alarming :(.
Yamaha DX7 --> Alesis D4 results in a 100% musical groove.
Yamaha DX7 --> PC --> Alesis D4 results in extreme latency; it's hardly
possible to keep on grooving. This can never be a latency of just 1 ms;
I'm sure the ALSA MIDI latency test is mistaken. Even a MIDI-thru chain
through several MIDI devices wouldn't cause such bad latency.
2.
I disconnected all ALSA MIDI connections. I disconnected the DX7 from
the hw interface.
Qtractor was connected to hw MIDI out, while hw MIDI out still was
connected to the D4.
3.
After adding a MIDI track I added a MIDI clip (from bar 2 to bar 4, 4/4,
120BPM) with those notes:
Time_____Note_____Vel._Duration
2.01.000 (36) C 1 64 0.00.240
2.02.000 (37) C#1 64 0.00.240
2.02.480 (36) C 1 64 0.00.240
2.03.480 (37) C#1 64 0.00.240
2.04.000 (36) C 1 64 0.00.240
2.04.480 (37) C#1 64 0.00.240
2.04.720 (36) C 1 64 0.00.240
3.01.000 (36) C 1 64 0.00.240
3.01.480 (37) C#1 64 0.00.240
3.02.480 (36) C 1 64 0.00.240
3.03.240 (36) C 1 64 0.00.240
3.03.480 (37) C#1 64 0.00.240
3.04.000 (36) C 1 64 0.00.240
3.04.480 (37) C#1 64 0.00.240
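For reference, the Time column above is in bar.beat.tick form. A sketch of
converting such positions to absolute seconds, assuming 4/4 at 120 BPM and
960 ticks per beat (the resolution is configurable in Qtractor, so treat the
tick divisor as an assumption):

```python
# Convert bar.beat.tick positions (e.g. "2.02.480") to seconds,
# assuming 4/4, 120 BPM, 960 ticks per beat.

BPM, BEATS_PER_BAR, TPB = 120, 4, 960

def to_seconds(pos):
    """'2.01.000' -> seconds elapsed since bar 1, beat 1."""
    bar, beat, tick = (int(p) for p in pos.split("."))
    beats = (bar - 1) * BEATS_PER_BAR + (beat - 1) + tick / TPB
    return beats * 60.0 / BPM

print(to_seconds("2.01.000"))  # 2.0  (bar 2 starts four beats, 2 s, in)
print(to_seconds("2.02.480"))  # 2.75 (the "and" of beat 2 in bar 2)
```

Under these assumptions a note duration of 0.00.240 is 240 ticks, i.e. a
sixteenth note, which matches the short drum hits in the clip.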
Loop play to hardware MIDI from Qtractor is a disaster, hence I
copy-and-pasted the clip 39 times, with the grid set to beat. The
'rhythm' then ran from bar 2 to bar 82.
4.
After adding a second MIDI track, I copied all clips from the first to
the new track.
The second track isn't for hw MIDI, but FluidSynth DSSI.
I connected Qtractor audio out to system playback.
Before I could play with the hw MIDI track muted, Qtractor crashed.
I opened a third tab for the GNOME terminal and tried to kill Qtractor
without success.
$ su
# pidof qtractor
4173
# kill 4173
# killall qtractor
# killall -9 -w qtractor
^C
# killall -9 qtractor
I stopped JACK.
[...]
JackAudioDriver::ProcessAsync Process error
JackEngine::XRun: client = Qtractor was not run: state = 1
JackAudioDriver::ProcessAsync Process error
JackEngine::XRun: client = Qtractor was not run: state = 1
JackAudioDriver::ProcessAsync Process error
JackEngine::XRun: client = Qtractor was not run: state = 1
JackAudioDriver::ProcessAsync Process error
JackEngine::XRun: client = Qtractor was not run: state = 1
JackAudioDriver::ProcessAsync Process error
JackAudioDriver::ProcessAsync Process error
^Cjack main caught signal 2
Unknown error...
FATAL: exception not rethrown
Aborted
But ...
# killall -9 qtractor
didn't work.
FWIW the Alesis D4 was set up to
NOTE: 036 C1
Kik/43: Slammin'
PITCH: +1.00
VOL: 99
PAN: <>
OUTPUT: MAIN
MODE: MULTI
NOTE: 037 C#1
Snr/92: Big Stik
PITCH: +1.00
VOL: 85
PAN: <>
OUTPUT: MAIN
MODE: MULTI
CHANNEL: 01
As usual for 64 Studio 3.3 alpha, several restarts ended with issues.
I had to turn the computer off and on.
1.
Started GNOME terminal tab 1.
su -c "poff dsl-provider"
2.
Started Evolution in offline mode and opened this Email.
3.
GNOME terminal tab 1
$ su -c "cpufreq-selector -g performance"
Password:
^C
spinymouse@64studio:~$ su -c "chgrp audio /dev/hpet"
I wonder why Ctrl + C is needed after running cpufreq-selector.
It's the same on Lucid.
$ su -c "sysctl -w dev.hpet.max-user-freq=64"
$ su -c "modprobe snd-hrtimer"
$ jackd -Rch -dalsa -dhw:0 -r96000 -p1024 -n2
4.
Opened a second tab for GNOME terminal.
$ qtractor /mnt/music/all-in-one_supplier/PCI_MIDI_test.qtr
HR timer still is selected.
The hw MIDI track is muted.
The FluidSynth DSSI MIDI track is ready to play.
FluidSynth is using 808Set.sf2.
Qtractor audio out was connected to system playback and system capture
was auto-connected to Qtractor audio in. I now disabled auto-connect but
kept these audio connections; actually I had to restore them manually,
as disabling auto-connect had perhaps disconnected them.
I had to disconnect and manually restore Qtractor MIDI out, because it
didn't connect to the 16:0 EWX 24/96 as stored, but to the 20:0 EWX
24/96.
5.
I launched Envy24 Control.
6.
I pushed play for Qtractor transport.
I stopped it.
Set up gain for FluidSynth from the default -14 dB to 0 dB.
Played again.
Unmuted the hw MIDI track.
The notes played by the soundfont and by the D4 fluctuate between being
completely out of sync and being played in unison.
I stopped playing.
RESULT
THE PCI MIDI INTERFACE WITH AN ALSA-MIDI-LATENCY-TEST RESULT OF < 1.1 ms
IS COMPLETELY USELESS FOR MAKING MUSIC, HOWEVER 'GREAT' THAT RESULT.
AND IT'S AUDIBLE THAT THERE IS MUCH MORE JITTER THAN 1.1 ms.
Any hints how to solve this are welcome.
What information is needed?
Cheers!
Ralf
Shit, I tried to do some research, but
lad.linuxaudio.org and linuxav.org/mailarchive seem to be down :( and it
seems that my last mail didn't come through on the list :(.
Does anybody else have this trouble?
If not, please send me an email.
Shit!
Ralf
Hello all developers. Sorry for my probably slow and clumsy English.
All standalone instruments, processors, and other modules controlled
through MIDI currently have, as you understand, a serious disadvantage
compared to audio plugins: the user must remember all MIDI parameter
numbers and sometimes values (e.g. Aeolus stop switching) in order to
control them via MIDI.
It would be nice for all MIDI-managed software to be able to send out
MIDI parameter changes when the user changes them in the software's
native GUI. For example, when a user toggles several stops in Aeolus, it
would send the appropriate MIDI messages, so that when these are sent
back into the instrument, they toggle the same stops.
Then users could use standalone software as easily as audio plugins:
Aeolus, Yoshimi/ZynAddSubFX (not sure all parameters are available via
the GUI), Phasex... jack-rack, fst and vsthost (dssi-vst), rakarrack,
and many others.
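As an illustration of the proposal (all class and parameter names below are
hypothetical; no real Aeolus or plugin API is used), the pattern is simply
that every GUI setter also emits the equivalent MIDI message, and the MIDI
handler drives the same state:

```python
# Sketch of GUI-to-MIDI parameter feedback: changing a parameter from
# the GUI echoes a CC message that, replayed, reproduces the change.

class Parameter:
    def __init__(self, cc, value=0):
        self.cc, self.value = cc, value

class FeedbackSynth:
    def __init__(self, send_midi):
        self.send_midi = send_midi                 # callback taking a raw 3-byte message
        self.params = {"cutoff": Parameter(cc=74)}  # hypothetical parameter map

    def set_from_gui(self, name, value):
        """GUI change: update state AND echo it as a MIDI CC."""
        p = self.params[name]
        p.value = value
        self.send_midi([0xB0, p.cc, value])        # CC on channel 1

    def handle_midi(self, msg):
        """A replayed message toggles the same state."""
        status, cc, value = msg
        if status & 0xF0 == 0xB0:
            for p in self.params.values():
                if p.cc == cc:
                    p.value = value

sent = []
synth = FeedbackSynth(sent.append)
synth.set_from_gui("cutoff", 99)
print(sent)  # [[176, 74, 99]]
```

A sequencer could then record the echoed messages and play them back, giving
standalone software the same automation story as a plugin host.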
Kokkiniza! ;)
Transport issue for Qtractor - it has an impact on the jitter issue.
So I'll soon follow the advice to use amidiplay.
Hi all :), hi Robin :), hi Devin :)
Robin, for 64 Studio 3.3 alpha the group has read and write access to
/dev/hpet too. Btw, '[...] | sudo tee [...]' isn't good for 3.3 alpha,
given the enabled root account. Anyway, for this recording test I kept
the value 64 for hpet/max-user-freq, but I'll test higher values soon.
Devin and some others want to know if the drum module is played before
FluidSynth DSSI is played.
JACK2 doesn't start with -p4096, so I started JACK2 with -Rch -dalsa
-dhw:0 -r44100 -p2048 -n2 to increase the unwanted effect, to get a
clear result.
Without recording, it's clearly audible that the external drum module is
always played before FluidSynth DSSI, which is plausible given the audio
latency (stupid idea to use such high latency ;), so there's a need to
do an audio test with lower latency, to see if jitter might change which
instrument is played first.
Instead of a rhythm I recorded four-to-the-floor at 120 BPM to 'see' the
jitter.
I recorded FluidSynth DSSI through the sound card too: left channel the
drum module, right channel FluidSynth DSSI, and judging by Qtractor's
waveform display, there is jitter for both!
Ok, next recording with -Rch -dalsa -dhw:0 -r96000 -p512 -n2.
Without recording, it's already audible that the drum module is played
first all the time ...
and it's visible too. Again there's jitter for both. The audio recording
of the drum module is always before the MIDI event. The recording of
FluidSynth DSSI is sometimes before and sometimes after the MIDI event.
There's no offset for the audio track.
I kept -Rch -dalsa -dhw:0 -r96000 -p512 -n2 and recorded FluidSynth DSSI
alone, internally in Linux, without using the sound card.
The audio recordings are before the MIDI events and there's jitter. I
never noticed jitter internally in Linux before.
I need to repeat the test ASAP, but by using 64 Studio 3.0 beta and
perhaps an older version of Qtractor.
Playing FluidSynth DSSI via MIDI and the recording made internally in
Linux in unison, there isn't audible jitter. But after starting
playback, sometimes MIDI and the audio recording are perfectly synced
and sometimes there's delay, real delay between the recording and MIDI,
not only an early-reflection-like effect (though without audible jitter;
the jitter is only visible in the waveforms).
$ qtractor -v
Qt: 4.5.2
Qtractor: 0.4.6
More maybe tomorrow.
Cheers!
Ralf
I have three applications that want to use the sound card: two audio stream players and a VoIP phone.
I want to set up linux so that if a call comes in on the phone the OS
will disconnect the audio players, give exclusive access to the voip
phone, and then when the phone is done reconnect the audio players to
the sound card.
How can this be done?
I guess nearly every mail sent to LAD came back with a notice about
being unable to deliver the message in the time limit specified to
thaytan at noraisin dot net, so no mail was delivered to somebody who
obviously has something serious to do with GStreamer.
Perhaps this is unimportant, but because I don't know, I guess it's
better to report this 'issue' or 'non-issue'.
- Ralf
No FireWire here. I once had a MOTU, but I guess that there isn't a
driver for Linux, and the guy who lent me the MOTU + Mac was Dirk
Brauner, who isn't a friend anymore. I guess the MOTU was audio-only.
The people who are still my friends don't have equipment much different
from mine: always Envy24-based PCI; one friend just has more IOs on his
Envy24-based PCI card.
> Make sure that the MIDI device is being triggered before the soft
> synth before you post to LAD. If it ends up being the case, then go
> ahead and post it on LAD.
You're right, it was stupid of me to spread too much speculation.
And yes, with your knowledge you should join LAD.
- Ralf
On Wed, 2010-07-14 at 14:12 -0700, Devin Anderson wrote:
> On Wed, Jul 14, 2010 at 12:43 PM, Ralf Mardorf
> <ralf.mardorf(a)alice-dsl.net> wrote:
> > On Wed, 2010-07-14 at 12:30 -0700, Devin Anderson wrote:
> >> On Wed, Jul 14, 2010 at 10:29 AM, Ralf Mardorf
> >> <ralf.mardorf(a)alice-dsl.net> wrote:
> >>
> >> > Hi :)
> >> >
> >> > delayed by a thunder-storm I could do another test.
> >> > --snip--
> >>
> >> So, what you're saying is that your MIDI device and software synth
> >> sync up less and less as you raise the period size.
> >
> > Yes :).
> >
> >> I had presupposed
> >> before that your MIDI device was triggering *after* your software
> >> synth, but it occurs to me that it might be the other way around. Do
> >> you hear the audio from your software synth first, or from your MIDI
> >> device?
> >
> > I can't say it today, now I do some office work. I had the impression
> > that it might vary. Sometimes the virtual drum sampler and sometimes the
> > standalone drum sampler was played earlier, I need to check this ASAP.
> > For older tests with my USB MIDI device it was exactly that way, that
> > jitter had positive and negative delay. At least the recorded waveforms
> > of external MIDI equipment (when I used USB MIDI, now I'm using PCI
> > MIDI), were recorded by Qtractor, before theoretically the MIDI event
> > was send ;). Note! Qtractor had no latency compensation, all recorded
> > audio of external MIDI instruments should have (positive) delay, but
> > negative delay.
>
> If it ends up being the case that your MIDI device is being triggered
> before your software synth, then I'm guessing that the issue here is
> not MIDI jitter. I'm guessing the issue is that the latency that's
> imposed by JACK on incoming and outgoing audio is not imposed on
> incoming and outgoing ALSA MIDI. So, while the audio coming out of
> the software synth is delayed by a certain amount of frames imposed by
> JACK, the audio coming out of your MIDI device is only delayed by the
> latency of the ALSA drivers, the latency of the MIDI ports, the
> latency of your MIDI device.
>
> This would certainly explain why the problem gets worse as you raise
> the period size, and could explain why you had positive and negative
> delay in your older USB MIDI tests, as the reported MIDI jitter in
> your tests was *far* worse in your older tests than it is now.
>
> At the moment, I happen to be doing some work in JACK 2 that could
> potentially solve this issue by enabling MIDI to sync more closely
> with audio, so I'm very curious to know if my suspicions are correct.
> Please keep me updated. :)
Should I build JACK dummy packages for 64 Studio and daily get JACK2
from svn co http://subversion.jackaudio.org/jack/jack2/trunk/jackmp ?
I wonder if this should be cross-posted to LAD?
On LAD and the 64 Studio list are people with much knowledge and your
reply might hit the nail on the head.
- Ralf
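The explanation Devin quotes above is easy to put rough numbers on: the soft
synth's audio is delayed by JACK's period buffering, while the external
module only sees the roughly 1 ms raw MIDI path, so the gap grows with the
period size. A sketch with the settings used in these tests (the figures are
illustrative, not measured):

```python
# Illustrative arithmetic: JACK buffer latency vs. the raw MIDI path.
# With 2 periods, the output buffering is periods * frames / rate.

def jack_latency_ms(frames, rate, periods=2):
    return 1000.0 * frames * periods / rate

MIDI_PATH_MS = 1.0  # roughly what alsa-midi-latency-test reported

for frames, rate in [(512, 96000), (1024, 96000), (2048, 44100)]:
    gap = jack_latency_ms(frames, rate) - MIDI_PATH_MS
    print(f"{frames} frames @ {rate} Hz: soft synth lags by ~{gap:.1f} ms")
```

This reproduces the observed pattern: at -p2048 -r44100 the gap is tens of
milliseconds, clearly audible, while a tight period size shrinks it toward
the point where jitter alone decides which instrument sounds first.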