On Thu, Feb 20, 2014 at 5:32 PM, Stefano D'Angelo <zanga.mail@gmail.com> wrote:
Hi all,

Let's say I have a client that introduces an amount of latency that's
variable at runtime and potentially unbounded. From JACK's docs it
seems that you need to recompute the min/max latencies in the latency
callback that's called "by the server" whenever it feels like, but you
can force that by calling jack_recompute_total_latencies (right?).

The problem is, you are advised to call this last function only after
calling jack_port_set_latency_range(), which you should only call from
within the latency callback, which may be called next month... am I dumb
(probably) or is there a circular dependency here?

The latency callback will be issued (twice: once for capture latency, i.e. upstream, once for playback latency, i.e. downstream) after one of two things happens:

   * the graph changes
   * a client calls jack_recompute_total_latencies()
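
To make the two-pass shape of the callback concrete, here is a minimal sketch of a latency callback for a hypothetical client with one input and one output port and an internal delay of `own_latency` frames; `in_port`, `out_port`, and `own_latency` are assumed to be set up elsewhere in the client:

```c
/* sketch: latency callback for a single-in, single-out client.
 * in_port, out_port and own_latency are assumed to exist elsewhere. */
#include <jack/jack.h>

extern jack_port_t *in_port;
extern jack_port_t *out_port;
extern jack_nframes_t own_latency;   /* this client's internal delay */

static void latency_cb(jack_latency_callback_mode_t mode, void *arg)
{
    jack_latency_range_t range;

    if (mode == JackCaptureLatency) {
        /* upstream pass: read the range arriving at our input,
         * add our own delay, publish it on our output */
        jack_port_get_latency_range(in_port, JackCaptureLatency, &range);
        range.min += own_latency;
        range.max += own_latency;
        jack_port_set_latency_range(out_port, JackCaptureLatency, &range);
    } else {
        /* downstream pass: same thing in the other direction */
        jack_port_get_latency_range(out_port, JackPlaybackLatency, &range);
        range.min += own_latency;
        range.max += own_latency;
        jack_port_set_latency_range(in_port, JackPlaybackLatency, &range);
    }
}

/* registered once, before jack_activate():
 *   jack_set_latency_callback(client, latency_cb, NULL);
 */
```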

You would call the latter if something happens that alters your client's own latency (e.g. a change to some parameter of an algorithm that causes your client's latency to change). Then you wait for the latency callback and reset your port latencies there.
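
So there is no circular dependency: the parameter change only updates the value the latency callback reads, and jack_recompute_total_latencies() is what causes the callback (and hence jack_port_set_latency_range()) to actually run. A sketch of that trigger side, assuming the same hypothetical `client` and `own_latency` as above:

```c
/* sketch: a parameter change alters the client's internal latency.
 * client and own_latency are assumed globals shared with the
 * latency callback. */
#include <jack/jack.h>

extern jack_client_t *client;
extern jack_nframes_t own_latency;

static void set_algorithm_delay(jack_nframes_t new_delay)
{
    /* store the value the latency callback will read ... */
    own_latency = new_delay;

    /* ... then ask the server to re-run the latency callbacks
     * (capture and playback passes), in which
     * jack_port_set_latency_range() publishes the new ranges */
    jack_recompute_total_latencies(client);
}
```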