Hi,
Thank you for your great work on JACK.
I am trying to build a real-time application on top of the JACK server/client architecture, but capture and playback latency is preventing me from making the application sample-accurate.
My Jack server configuration (jackd -dalsa):
JACK server starting in realtime mode with priority 10
self-connect-mode is "Don't restrict self connect requests"
audio_reservation_init
Acquire audio card Audio0
creating alsa driver ... hw:0|hw:0|1024|2|48000|0|0|nomon|swmeter|-|32bit
configuring for 48000Hz, period = 1024 frames (21.3 ms), buffer = 2 periods
ALSA: final selected sample format for capture: 32bit integer little-endian
ALSA: use 2 periods for capture
ALSA: final selected sample format for playback: 32bit integer little-endian
ALSA: use 2 periods for playback
My understanding:
0. Realtime mode = yes, priority 10
1. Sample rate = 48000 Hz
2. 1 period = 1024 frames (21.3 ms)
3. Buffer size = 2 periods = 2048 frames
4. Capture latency = 2 periods = 1 buffer = 2 x 21.3 ms = 42.6 ms
5. Playback latency = 2 periods = 1 buffer = 2 x 21.3 ms = 42.6 ms
Application:
passthru/simple_client.c from the Jack2 repository, with one client doing both stereo capture and mono playback
My application pipeline:
input -----------> capture left channel (stage 1) ------------> playback left channel (stage 2) --------------> capture right channel (stage 3)
Latency calculation:
After stage 1 --------------- 2 periods = 42.6 ms
After stage 2 --------------- 2 + 2 periods = 85.2 ms
After stage 3 --------------- 2 + 2 + 2 periods = 127.8 ms
Difference between stage 3 and stage 1:
6 periods - 2 periods = 4 periods
So the latency difference should be 4 periods (85.2 ms), but I am measuring more than 4 periods, and the measured value also varies between trials.
So, how can I make my application sample-accurate, and what is the correct procedure?
Any help would be greatly appreciated. Thanks in advance.
Best Regards
Ruhul Amin