Hello,
Here is the use case: audio is sent from one application to be processed
by another JACK client and then sent back, the processed signal being
mixed back in, possibly together with a copy of itself. How would one go
about estimating by how much the processed signal was delayed?
1) JACK base latency:
My understanding is that JACK will always introduce a latency of
buffer_size * nperiods frames (the buffer size being the period size in
frames), is that correct? If the signal is sent out, processed, and then
sent back, the accumulated delay would be at least twice that nominal
JACK latency, right?
- Problem: on the practical side, how could we calculate the base
latency using the available JACK utilities? There are jack_bufsize and
jack_samplerate, but no way to find the number of periods, as far as I
can tell.
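On that point, here is a rough, untested sketch of what I had in mind for
querying these values programmatically. If I read the headers correctly,
the driver publishes its own latency on the physical system ports, and any
client can read it with jack_port_get_latency_range(), which would already
account for the number of periods; the port name "system:playback_1" is
just an assumption, to be adjusted to the actual setup:

/* latency_probe.c - print JACK's period size, sample rate, and the
 * latency range reported on a hardware playback port.
 * Build (typically):
 *   gcc latency_probe.c -o latency_probe $(pkg-config --cflags --libs jack)
 */
#include <stdio.h>
#include <jack/jack.h>

int main(void)
{
    jack_status_t status;
    jack_client_t *client = jack_client_open("latency_probe", JackNoStartServer, &status);
    if (!client) {
        fprintf(stderr, "could not connect to JACK (status 0x%x)\n", status);
        return 1;
    }

    jack_nframes_t period = jack_get_buffer_size(client);  /* frames per period */
    jack_nframes_t rate   = jack_get_sample_rate(client);
    printf("period = %u frames, sample rate = %u Hz (%.2f ms per period)\n",
           (unsigned) period, (unsigned) rate, 1000.0 * period / rate);

    /* The backend sets the latency of the physical ports itself, so the
     * number of periods never has to be queried separately. */
    jack_port_t *out = jack_port_by_name(client, "system:playback_1");
    if (out) {
        jack_latency_range_t range;
        jack_port_get_latency_range(out, JackPlaybackLatency, &range);
        printf("system:playback_1 playback latency: %u..%u frames\n",
               (unsigned) range.min, (unsigned) range.max);
    }

    jack_client_close(client);
    return 0;
}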
2) Latency of the processing client:
That would depend largely on the client, I guess. The README.CONFIG for
jconvolver states, for example, that setting the partition size equal to
the JACK period size results in zero additional latency...
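Related to that, my (possibly naive) understanding is that a well-behaved
processing client can report whatever extra delay it adds through JACK's
latency callback, so other clients could query it instead of guessing.
Below is a minimal, untested sketch of what I mean (the client name, the
port names and the EXTRA value are placeholders, not anything taken from
jconvolver):

/* delay_reporter.c - pass-through client that pretends to add EXTRA frames
 * of internal delay and advertises it through the JACK latency API. */
#include <string.h>
#include <unistd.h>
#include <jack/jack.h>

#define EXTRA 256  /* frames of processing delay this client claims to add */

static jack_port_t *in_port, *out_port;

static void latency_cb(jack_latency_callback_mode_t mode, void *arg)
{
    jack_latency_range_t range;
    (void) arg;
    if (mode == JackCaptureLatency) {
        /* how long ago the data leaving our output was captured */
        jack_port_get_latency_range(in_port, JackCaptureLatency, &range);
        range.min += EXTRA;
        range.max += EXTRA;
        jack_port_set_latency_range(out_port, JackCaptureLatency, &range);
    } else {
        /* how long until data fed to our input reaches the speakers */
        jack_port_get_latency_range(out_port, JackPlaybackLatency, &range);
        range.min += EXTRA;
        range.max += EXTRA;
        jack_port_set_latency_range(in_port, JackPlaybackLatency, &range);
    }
}

static int process_cb(jack_nframes_t nframes, void *arg)
{
    (void) arg;
    /* real processing would go here; this sketch just copies input to output */
    memcpy(jack_port_get_buffer(out_port, nframes),
           jack_port_get_buffer(in_port, nframes),
           nframes * sizeof (jack_default_audio_sample_t));
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("delay_reporter", JackNoStartServer, NULL);
    if (!client)
        return 1;
    in_port  = jack_port_register(client, "in",  JACK_DEFAULT_AUDIO_TYPE, JackPortIsInput, 0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE, JackPortIsOutput, 0);
    jack_set_process_callback(client, process_cb, NULL);
    jack_set_latency_callback(client, latency_cb, NULL);
    jack_activate(client);
    for (;;)
        sleep(1);
}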
Any insight in this matter would be greatly appreciated.
Cheers,
S.M.
--