I read a thread on jack-devel about issues when restoring connections
[1], which reminded me to send a request.
I won't quote the complete thread ;).
However, wouldn't it be possible to take the IRQ of several identical sound
cards into account when restoring MIDI connections? The "name" of my
TerraTec cards is the same for both cards I use, but their IRQs differ,
and each card always gets the same IRQ. IOW, neither the QJackCtl patchbay
nor aj-snapshot is able to distinguish between the identically named sound
cards, but in principle their IRQs could be used to tell them apart.
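On a PCI card, the IRQ shows up in the second line of each entry in /proc/asound/cards, so a tool could in principle pair card index and IRQ there. A hedged sketch, run against an invented sample listing (the card names, addresses, and IRQ numbers below are made up for illustration):

```shell
#!/bin/sh
# Sketch: given a /proc/asound/cards style listing, print "card-index irq"
# pairs, so two identically named cards can still be told apart by IRQ.
# In real use you would read /proc/asound/cards instead of this sample.
cards=' 0 [DMX6Fire       ]: ICE1712 - TerraTec DMX6Fire
                  TerraTec DMX6Fire at 0xd800, irq 16
 1 [DMX6Fire       ]: ICE1712 - TerraTec DMX6Fire
                  TerraTec DMX6Fire at 0xd000, irq 17'

# Remember the index from each card header line, then emit it together
# with the IRQ found on the following detail line.
printf '%s\n' "$cards" | awk '
  /^ *[0-9]+ \[/ { idx = $1 }
  / irq /        { sub(/.*irq /, ""); print idx, $0 }
'
# prints:
# 0 16
# 1 17
```

So the information needed to disambiguate the cards is already exported by ALSA; the missing piece is aj-snapshot/QJackCtl actually consulting it.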
[1]
-------- Forwarded Message --------
From: Ralf Mardorf <ralf.mardorf(a)alice-dsl.net>
To: jack-devel(a)lists.jackaudio.org
Subject: Re: [Jack-Devel] 'connect' vs 'patchbay' and the problems of
everything not working as expected..........
Date: Sun, 26 Jan 2014 21:01:10 +0100
Mailer: Evolution 3.10.3
"Asio4All" :S
On Linux it's better to use aj-snapshot than QJackCtl, but even
aj-snapshot can't restore all connections if you use several identical
sound cards.
The QJackCtl patchbay "rules" do work in many cases, but aj-snapshot
will restore exactly what you set up, provided you don't use several
identical devices.
Isn't there something similar to aj-snapshot for Windows too?
Looks like it might be useful for custom Linux instruments using a Raspberry Pi or BeagleBone, or even just for controlling Ardour on a PC:
http://www.indiegogo.com/projects/pitouch-hdmi-multitouch-monitor-for-raspb…
Endorsement/disclosure: I have done some Linux and Android development work for this guy in the past, and helped him research Linux support for this project (which was very quick and easy: multitouch support is built into kernel 3.2 and later and works out of the box).
Audio output uses HDMI, but for input one could use an external USB interface (with the USB-hub version on a Pi/Beagle), or something more custom. If used with a PC instead, one could use FFADO, etc.
-ken
Hey all,
I've been reading with interest that some of you use netbooks/small
form factor PCs/Raspberry Pis on stage for live performance. I'm
currently developing an LV2 plugin and am wondering what the lowest
common denominator screen resolution is that I should be aiming for.
During performance, what, if anything, do you actually need to see apart
from what the current preset is? Is there a use case for making all
parameters of a synth plugin (for example) available/accessible on
screen, or for using MIDI learn to hide/unhide parameter tabs, etc.?
Examples of current programs/plugins would be great.
cheers,
Brendan
Slightly off topic for linux audio, but I need some MIDI advice and this is
the only list with electronic musicians that I currently monitor.
I have a few hardware synths that I would like to begin using with my
computer music setup. The last time I had them all set up I did not have
a computer included in the system, so I had somewhat simpler connection
needs.
I have a MIDI interface with one port, and wonder if there is a better
way of managing the MIDI connections than daisy-chaining via the In and
Thru ports on the synths. Especially for the input of the computer card:
I now have more than one controller that I would like to alternate
between as the MIDI-generating device.
I think the essence of my question is: what is the best arrangement for
something like a patchbay for MIDI?
There is a company called MIDI Solutions which has merge and splitter
devices powered from the MIDI connection:
http://www.midisolutions.com/products.htm
That is the only current manufacturer I have found. All the previous
makers of MIDI patch/merge devices seem to have stopped producing
anything similar to the Roland A-880, Kawai MAV-8, Digital Music MX-8,
etc. that I have found while searching eBay.
My first instinct is to buy or build a real physical patch bay, with a row
of DIN connectors and patch cables to let me patch any device to the
computer's input, the computer's output to any device's input, and thru
from any device to any other device. That sounds like a pain now that I
write it out, and like something that should be done from an electronic
interface rather than with connectors and cables.
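For what it's worth, once every device reaches the computer (e.g. via stacked USB MIDI interfaces), the ALSA sequencer itself can act as the electronic patchbay: `aconnect` routes any sender port to any receiver port. A hedged sketch, where the client:port numbers are hypothetical and would come from `aconnect -l` on your own machine:

```shell
# List all ALSA sequencer clients and their ports.
aconnect -l

# Patch a hardware input (hypothetically client 20, port 0) into a
# software client (hypothetically 128:0, e.g. a synth).
aconnect 20:0 128:0

# Remove that patch again.
aconnect -d 20:0 128:0

# Clear every sequencer connection at once.
aconnect -x
```

That handles the routing side in software; it doesn't remove the need for enough physical MIDI ports, which is where the multi-port interfaces or a hardware merger still come in.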
What is everyone else with a large collection of MIDI hardware doing?
Stacking up USB MIDI interfaces on a USB hub? Just hanging on to the
vintage MIDI patch bays? Is there a current production model I have
missed?
--
Chris Caudle
On 12/10/2013 22:37, "Will Godfrey" <willgodfrey(a)musically.me.uk> wrote:
>
> On Sat, 12 Oct 2013 13:01:47 -0700
> Ken Restivo <ken(a)restivo.org> wrote:
>
> > Tonight I'll be playing the linux netbook on a few songs at my friend's
> > band's show, at the Red Devil Lounge in San Francisco:
> >
> >
> > http://www.reddevillounge.com/event/375173-afton-live-mrmime-rippin-san-fra…
> >
> > We've been friends for 30 years, I produced the original demos for his
> > songs over 20 years ago, was in his band briefly 15 years ago when he was
> > recruiting people and before they started gigging, played on his CD a few
> > years ago, but have never actually been onstage with him in all these
> > decades, for whatever reason. So, tonight, that happens.
> >
> > There will be Monosynth, mostly, and that's about it.
> >
> > -ken
>
> Nice one! Hope it all goes well.
>
> --
> Will J Godfrey
> http://www.musically.me.uk
> Say you have a poem and I have a tune.
> Exchange them and we can both have a poem, a tune, and a song.
> _______________________________________________
> Linux-audio-user mailing list
> Linux-audio-user(a)lists.linuxaudio.org
> http://lists.linuxaudio.org/listinfo/linux-audio-user
Hi all (again),
I'd love to watch some recent video of those live performances of yours, Ken.
Kindest regards to all.
Hello,
I'm seeing that as soon as I trigger a sound in qsynth (e.g. a drum
pad on the Axiom 25), the audio gets badly garbled and very noisy,
and I have to terminate qsynth. qsynth version 0.3.6-2, Linux Mint 14
64-bit, fluid-soundfont-gm and -gs version 3.1-5. It actually started
doing this once in a while recently, and now it is unusable, as it
immediately outputs growing noise as soon as a note is played. A bit
strange, because software does not usually get worse the way an old car can.
I've shut down Ardour and started qsynth, and now it does this by
itself after a few seconds: a highly distorted, growing feedback (Larsen)
effect that becomes very loud and then fades away into crackling bits.
After that the audio cannot be used. If Ardour is playing, its sound
becomes fully distorted with crackling noises. I have to stop qsynth,
and then the Ardour audio becomes normal again; when qsynth stops,
there's a click and the audio returns to normal.
Using QjackCtl, I disconnected all audio from system and the PulseAudio
JACK Sink, and disconnected all MIDI as well. When I started qsynth, two
connections were made from qsynth to playback_1 and _2, which is OK. Then,
after a few seconds, the noise appeared, without any note being played.
There does not seem to be an update to the qsynth package (at least
within the scope of Linux Mint 14).
Hopefully other people have noticed this problem!
Hi,
in my spare time I'm working on a console-based JACK MIDI sequencer
called teqqer.
Since this is a console app, I'm interested in feedback on how it works,
especially with interfaces for sight-impaired people (Braille displays, etc.).
The software is in the alpha stage, so expect a lot of missing functionality,
but I suppose at this point it is not too late to change (or add) things
to make it work better with these kinds of interfaces.
So if anyone is so inclined, please try
https://github.com/fps/teqqer
The README.md has build instructions.
Running
./teqqer example.teq
should load an example song.
Using Meta-h h should bring up a help text (also reachable by the menu
at the bottom of the screen) with the default keybindings.
Thanks,
Flo
Hello all,
Is it possible to monitor a to-be-recorded Ardour track without
actually recording it? By monitoring I mean hearing all the effects that
are on that channel. What I currently do is add a send on the track
that goes to channel 8 (1010LT card). Channel 8 then outputs
the instrument plus all effects applied to it (assuming the send is at
the end of the chain). But this only works when recording is on. Not
much of a problem, as the track can be deleted afterwards, but it would
be nice not to have to press record just to hear the monitoring, e.g. to
practice or fool around.
Hello,
I have some tracks of acoustic instruments in Ardour, in front of which
I recorded 8 metronome ticks before the playing starts. So there are 8
clicks of 'silence' before the instruments begin. This is a sketch,
so I'd like to play around with Renoise while the instruments are
playing, especially with patterns. The problem is that Renoise starts
immediately, at the same time as Ardour, so the music starts when
Renoise is already at row 32 (out of 64, the default pattern size).
I tried adding a metronome pre-count in Renoise and then using Renoise
to start Ardour, but the pre-count has no effect. Is there a way to play
around looping a pattern in Renoise while Ardour plays its tracks,
assuming the tracks were recorded against Ardour's metronome and a
64-row pattern in Renoise fits neatly into what is playing back?
In other words, is it possible to jam in Renoise while audio tracks
are playing? Surely it is! But is it possible to do it with a
metronome count-in before looping a pattern?
Of course, any suggestion for achieving this in a better way is
welcome! (ツ)
Cheers.
Hi,
I have found a few different suggestions online for using icecast with
jack. It seems that some people have been doing it for over 10 years now.
However with the debian packages for icecast2 and ices2 I receive this in
the logs:
[2014-01-24 09:19:08] INFO ices-core/main IceS 2.0.1 started...
[2014-01-24 09:19:08] EROR input/input_loop No input module named "jack"
could be found
[2014-01-24 09:19:08] INFO ices-core/main Shutdown complete
- Here is a reference for the ices.xml file:
http://io.rg42.org/trac/browser/misc/ices2/ices-jack.xml
- I also found this project, which claims to have JACK support, but I
cannot find/download the ices-kh package:
http://karlheyes.github.io/
https://aur.archlinux.org/packages/ices-kh/
So, what's the magical procedure?
Also, why doesn't the Debian package of ices2 have support for JACK? It
has ALSA, OSS, even RoarAudio support*, so is it a deliberate political
decision that I am not aware of?
[*]http://www.icecast.org/docs/ices-2.0.2/inputs.html
- Alternatively, what is the correct way to pipe JACK audio into the
ices2 "stdinpcm" module?
ex. ecasound -i:jack -o:stdout | ices2 ices-jack.xml
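For that pipe to work, the ices.xml input section has to be switched to the stdinpcm module and its PCM parameters have to match what ecasound writes to stdout. A hedged sketch of such a stanza; the rate, channel count, and metadata path here are assumptions, not values taken from the referenced ices-jack.xml:

```xml
<!-- Sketch of an ices2 <input> section using the stdinpcm module.
     Sample rate and channel count must match the raw PCM that the
     upstream command (ecasound in the example above) pipes in. -->
<input>
  <module>stdinpcm</module>
  <param name="rate">44100</param>
  <param name="channels">2</param>
  <param name="metadata">1</param>
  <param name="metadatafilename">/tmp/ices-metadata</param>
</input>
```

If the rates disagree, the stream will play back at the wrong pitch/speed rather than fail outright, so it's worth double-checking both ends.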
--
Patrick Shirkey
Boost Hardware Ltd