Hi *,
I wanted to start a discussion of what kind of AVB connectivity makes the most sense for jack.
But please keep in mind what AVB is and isn't.
a) Fully functional Backend
b) Media Clock Backend with AVB jack clients
pro a)
- the AVDECC connection management could be done seamlessly in a jack way
- out-of-the-box AVB functionality
con a)
- only one talker/listener, single audio interface
- huge programming effort
- no dynamic audio mapping
pro b)
- multiple talkers/listeners with multiple audio interfaces using the ALSA API
- avoids a huge code addition to the backend, thus much easier to maintain
- AVDECC handling per client for dynamic audio mapping
con b)
- CPU load: the price for multiple talkers/listeners
I'm excited to hear your opinions!
Best,
Ck
--
This message was sent from my Android device with K-9 Mail.
Hello all.
I have embarked on a long journey of porting my radio automation and
mixing system from OSX (making extensive use of Apple's CoreAudio API)
to a Linux/JACK/gstreamer system and potentially, making it a cross
platform application. The application is actually a collection of
programs, ranging from a studio UI, library management UI, database
back-end, and a faceless audio-engine/mix system. I am currently
working on the audio-engine port, since potentially I could port that
and continue to use the old UIs on Macs until I can get those ported as
well.
In order to make this manageable, I decided to pull some audio features
out of my existing audio-engine, notably AudioUnit effects hosting, IAX
phone support, and multiple audio-device support. I should be able to
add LV2 effects hosting and IAX back in at some point. But for the
short term, I can use jack to host effects in another program for now,
and just give up IAX/asterisk integration. The multiple audio-device
support, via my own clock drift measurement and re-sampling scheme, is
a loss for now, but it looks like the alsa_in and alsa_out programs
could be a stop-gap for that, and they appear to use a similar
underlying approach.
So far, the porting is going much faster than I had expected. Jack2 is
a very clean and well thought out API. It is not unlike CoreAudio, only
it's MUCH easier to use. Leveraging JACK's inter-application audio
routing, I have further broken my audio-engine down into a mixing
system and control session host, with separate programs executed for
playing and recording/streaming audio content. This is a very nice,
clean way to break things up, made possible by JACK. Thanks.
That is the big picture. I have come across some questions that I need
a little help with, all regarding the start-up process of the jackd
server:
1) At first, JACK seems to be designed for a single jackd server
running on a system, with the server control API lacking, as far as I
can tell, a method to control multiple servers by name. But then, with
the name command line option, I can actually run multiple servers tied
to different audio devices. Does the server control API simply not
support names at this point? Maybe this is just how JACK evolved:
originally one jackd per machine, then named-server support was added,
but not fully implemented across the whole API?
2) Again, with the named server: I can start a named jackd server. I
can connect to it using QjackCtl by setting the name property in
the advanced tab of QjackCtl settings. But when I try to connect to the
named server from my audio-engine application, via a jack_client_open()
call, passing the name char string, my application instead tries to
start up a jackd server instance, using the command noted in the jackd
man page (first line of $HOME/.jackdrc). Is this a bug, or am I
missing some detail?
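For reference, this is roughly the shape of the call I am attempting
(simplified; the client name "engine" and server name "mixer" are just
placeholders, and I may of course be holding the API wrong):

#include <stdio.h>
#include <jack/jack.h>

int main(void)
{
    jack_status_t status;

    /* placeholder names; the server would have been started beforehand
     * with something like: jackd --name mixer -d alsa ... */
    jack_client_t *client = jack_client_open("engine",
                                             JackServerName,
                                             &status,
                                             "mixer");
    if (client == NULL) {
        fprintf(stderr, "jack_client_open() failed, status = 0x%x\n",
                (unsigned int)status);
        return 1;
    }

    /* ... register ports, set callbacks, jack_activate(), etc. ... */
    jack_client_close(client);
    return 0;
}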
3) Regarding the jack_client_open() behavior when no server is yet
running: the function call does seem to execute the first line of
$HOME/.jackdrc and start the jackd server running. However, it
appears to inherit all the file descriptors from my application. This
is a problem because my application is designed to self-restart on a
crash. With jackd holding my application's TCP control socket open, my
application can't restart (bind again to the desired TCP port) until
after I kill the jackd process. I assume the auto-jackd startup code
is forking and execing, and the code simply isn't closing the parent's
file descriptors. Is this a bug or intentional? Is there a way I can
detect if a jackd server is running ahead of time, so I can start the
server myself using my own fork/exec, which would close my descriptors
in the child; then, once I know jackd is running, call
jack_client_open() in my app?
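Something like the following is what I have in mind for the detection
part (just a sketch; "probe" is an arbitrary client name):

#include <stdbool.h>
#include <jack/jack.h>

/* Probe for a running server without triggering the $HOME/.jackdrc
 * auto-start.  The probe client is closed again immediately. */
static bool jackd_is_running(const char *server_name)
{
    jack_status_t status;
    jack_client_t *probe;

    if (server_name != NULL)
        probe = jack_client_open("probe",
                                 JackServerName | JackNoStartServer,
                                 &status, server_name);
    else
        probe = jack_client_open("probe", JackNoStartServer, &status);

    if (probe == NULL)
        return false;   /* status should include JackServerFailed */

    jack_client_close(probe);
    return true;
}

If that returns false, I would fork/exec jackd myself with my own
descriptors marked close-on-exec, wait until the probe succeeds, and
only then call jack_client_open() for real.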
4) Because my audio-engine is a faceless application and can be run
without a desktop session, I need it to be able to connect to a jackd
server run by other users, or to start a jackd server that can be used
by other users. I have verified that on Ubuntu 18.10 I cannot start
jackd from one user account and then connect to it from my application
running as a different user, even if I run my app as root (not that I
intend to do that as a real-world workaround). Is there some approach,
group permissions possibly, to allow other users to access a jackd
server?
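One thing I came across while searching, but have not tried yet, is
jack2's "promiscuous" mode, controlled by the JACK_PROMISCUOUS_SERVER
environment variable. As far as I understand it has to be set for both
the server and the clients, and (if given a group name) restricts
access to members of that group, something like:

# untested sketch: jack2 promiscuous mode shared via a common group;
# both users would need to be members of the 'audio' group here
export JACK_PROMISCUOUS_SERVER=audio
jackd -d alsa -d hw:0 &        # started by user A

# in the other user's session or service:
export JACK_PROMISCUOUS_SERVER=audio
./my-audio-engine              # placeholder for my client binary

Can anyone confirm whether that is the intended mechanism?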
5) Wishful thinking: Give jackd the ability to read QjackCtl config
files, so I could configure things from the GUI, then stop jackd and be
able to restart it from the server-control API or command line with a
command-line option pointing to a config file. Better yet, make a
persistent, JACK-aware place in the file hierarchy to store such files.
Thanks,
Ethan Funk
Hello, having trouble with the latency API.
I start Jack with this (the 6 periods are just for testing):
jackd -P30 -dalsa -r44100 -p512 -n6 -Xseq -D -Chw:M1010LT,0 -Phw:M1010LT,0
One Jack audio input port, one Jack audio output port,
one Jack midi input port, and one Jack midi output port
have been registered and connected to audio hardware ports
and midi hardware ports, respectively. All is functioning.
The ports were created with these calls:
jack_port_register(_client, name1, JACK_DEFAULT_AUDIO_TYPE, JackPortIsInput, 0);
jack_port_register(_client, name2, JACK_DEFAULT_AUDIO_TYPE, JackPortIsOutput, 0);
jack_port_register(_client, name3, JACK_DEFAULT_MIDI_TYPE, JackPortIsInput, 0);
jack_port_register(_client, name4, JACK_DEFAULT_MIDI_TYPE, JackPortIsOutput, 0);
So far so good. But when I ask for the latency of the
ports, the Jack midi ports are always reporting zero:
Jack audio client capture port:
Capture range: min 512 max 512
Playback range: min 0 max 3072
Jack audio client playback port:
Capture range: min 0 max 512
Playback range: min 3072 max 3072
Jack midi client capture port:
Capture range: min 0 max 0
Playback range: min 0 max 3072
Jack midi client playback port:
Capture range: min 0 max 512
Playback range: min 0 max 0
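For reference, the ranges above are read with
jack_port_get_latency_range(), roughly like this (a sketch, not my
exact code). The 3072 values are the expected 6 periods x 512 frames.

#include <stdio.h>
#include <jack/jack.h>

/* Sketch of how the ranges listed above are queried per port. */
static void print_latency(jack_port_t *port)
{
    jack_latency_range_t range;

    jack_port_get_latency_range(port, JackCaptureLatency, &range);
    printf("Capture range: min %u max %u\n", range.min, range.max);

    jack_port_get_latency_range(port, JackPlaybackLatency, &range);
    printf("Playback range: min %u max %u\n", range.min, range.max);
}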
What could be wrong here?
Thanks.
Tim.
Hey!
I'm having a hard time getting a tmux session started by systemd on
Raspbian (Patchbox OS, realtime kernel) to behave as if I had started
my script manually.
This is my Jack Capture service:
[Unit]
Description=Jack Capture autostart daemon
Wants=jack.service
After=multi-user.target
[Service]
Type=forking
EnvironmentFile=/etc/environment
User=patch
Group=realtime
ExecStart=/home/patch/jack-capture-tmux.sh
[Install]
WantedBy=jack.service
and this is my script to start tmux with jack_capture:
#!/bin/bash
. /etc/environment
/usr/bin/jack_wait -w
tmux new-session -d -s jackd
tmux new-window -t jackd -n "Jack Capture" -d
sleep 1
tmux send-keys -t jackd:0 'cd ~/' C-m
tmux send-keys -t jackd:0 'clear' C-m
tmux send-keys -t jackd:0 'jack_capture -c 2 -mb -tm -f wav --filename-prefix ABC-TEST- --hook-close ./hookstop.sh --hook-timing ./hookstart.sh' C-m
Everything is a bit bloated from all my attempts to make this work...
Now it starts properly but if I look into that tmux session I get
warnings/errors:
jack_capture -c 2 -mb -tm -f wav --filename-prefix ABC-TEST-
--hook-close ./hookstop.sh --hook-timing ./hookstart.sh
Cannot lock down 82287136 byte memory area (Cannot allocate memory)
>>> Warning. Could not set higher priority for a SCHED_OTHER process
>>> using setpriority().
Cannot use real-time scheduling (RR/70)(1: Operation not permitted)
JackClient::AcquireSelfRealTime error
>>> Waiting to start recording of "ABC-TEST-10.wav"
>>> Press <Ctrl-C> to stop recording and quit.
I had this solved when starting the script manually by adding "* - nice
-20" to /etc/security/limits.conf, but this obviously does not work
when the script is started by systemd (even though it is the same user).
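As far as I understand, /etc/security/limits.conf is applied by
pam_limits, and systemd services do not go through PAM, so I suspect
the limits have to be set in the unit itself. An untested guess for the
[Service] section (the rtprio/memlock values are guesses based on the
usual audio-group settings):

[Service]
LimitNICE=-20
LimitMEMLOCK=infinity
LimitRTPRIO=95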
Any ideas?
Cheers
Jan
Hi list,
(Sorry if this question has already been discussed, but the search function in the list archive does not work.)
Is anyone working on the problem of getting Jack up and running on macOS again? I installed Jack with brew, but I get these “could not handle external client request” errors, which seem to be an old problem judging by search results and GitHub discussions, introduced when the macOS audio architecture changed a while back. Is there any progress on these matters? Is someone working on it? I need Jack as it is a dependency of another library I would like to use, but I realize that I might have to abandon macOS as a platform in the process.
Kindest regards,
Stefan
Hi,
I've been reading the code of Ableton's LinkHut jack backend.
https://github.com/Ableton/link/blob/master/examples/linkaudio/AudioPlatfor…
And I was quite surprised because I did not find any latency compensation there.
So for real-time beat-time synchronization, latency compensation is quite important, right?
BTW, if any jack expert finds the time and motivation to fix the jack
backend of Link's example, that would be awesome and a great example.
Then I've read the documentation:
http://jackaudio.org/api/group__LatencyFunctions.html
And I am totally confused.
Why a latency range? What is the use case for it?
How is the application supposed to work with multiple inputs and
outputs, each having a different latency range?
What's the point of a latency range? If you're concerned about the
shortest path and the longest path, why wouldn't you be concerned
about any path in between?
How can I know at which time on a given clock (CLOCK_REALTIME or
CLOCK_MONOTONIC) a sample in the current process block will be played
on the sound card's output?
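What I would naively expect to be able to do, inside the process
callback, is something like this sketch (client, output_port and i are
my own placeholder variables, and I am assuming the port is connected
directly to a hardware playback port):

#include <jack/jack.h>

/* Sketch only: estimate, in JACK's microsecond clock (which I believe
 * is monotonic on Linux), when sample i of the current block reaches
 * the sound card output.  Must be called from the process callback. */
static jack_time_t sample_output_time(jack_client_t *client,
                                      jack_port_t *output_port,
                                      jack_nframes_t i)
{
    jack_latency_range_t range;
    jack_port_get_latency_range(output_port, JackPlaybackLatency, &range);

    /* frame time at the start of this process cycle */
    jack_nframes_t cycle_start = jack_last_frame_time(client);

    /* frame at which sample i should hit the DAC, taking the reported
     * playback latency (here: range.max) at face value */
    return jack_frames_to_time(client, cycle_start + i + range.max);
}

Is that the intended way to use the API, and which of range.min or
range.max should one pick?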
I've tried with qjackctl to set the periods/buffer to 12 with 2048
samples per buffer at 44100 Hz, which is about half a second of
latency (12 x 2048 / 44100 is roughly 0.56 s). Do you manage to get the
correct latency reported with those settings? See https://imgur.com/w45wsp8
Jack allows feedback routing; how do you deal with latency in such a graph?
I've been testing with Jack 1.9.12-8 on Archlinux and QJackCtl 0.5.6-1
Many thanks for your answer and your time,
Alexandre Bique
Folks,
I've been lurking here since the early 2000s, since the stuttering
beginnings of jack (was it called jaaa back then?), and I'm amazed at
the ugly arrogance we receive these days. "I have not contributed any
kind of idea or code, but still I know better." This pisses me off and
gets me depressed. Jack, for one, is a piece of software I'm amazed by.
I've learned more reading this list than in any other so-called
software course.
Please show humility where it is due, ask questions and offer insight
anytime, but keep your selfish, narrow-minded views to yourself until
you can provide an educated opinion.
Thanks for reading. To all the jack devs, you have my thankful respect.
-- D.