On 4/4/19 6:51 PM, Ralf Mardorf wrote:
> On Thu, 2019-04-04 at 18:19 +0200, Alexandre Bique wrote:
>> I've tried with qjackctl to set the periods/buffer to 12 with 2048
>> samples per buffer at 44100 Hz, which is about half a second of
>> latency. Do you manage to get the correct latency reported with those
>> settings? See https://imgur.com/w45wsp8
>
> What's wrong with QjackCtl's calculation?
>
> For the calculation I found a page that seemingly contains a typo:
> "(Frames [or buffer] / Sample Rate) * Periods = Latency in ms" -
> it should read "s", not "ms".
>
> (2048 frames / 44100 Hz sample rate) * 12 periods = 0.557 s, IOW 557 ms
The calculation seems fine to me, but do you get that same latency
reported by jack to the clients? That is the issue.