Ralf Mardorf wrote:
I never tested it myself, but I remember that it is often mentioned
not to use -n >2. Is there a reason to avoid -n >2, or is it just a myth?
The buffer size is the product of the period size (-p) and the number of
periods (-n). A larger buffer size increases latency, but reduces the
risk of underruns. A smaller period size _slightly_ increases CPU usage
because the overhead of handling a period is incurred more often.
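(To make the tradeoff concrete: the latency in seconds is the buffer
size divided by the sample rate, i.e.

  latency = period size * number of periods / sample rate

so, assuming a 48 kHz sample rate, "-p 256 -n 2" gives
512 / 48000 ≈ 10.7 ms.)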
Therefore, when optimizing for low latency, one typically uses two
periods and makes -p as small as possible.
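For example, a typical low-latency invocation of jackd's ALSA backend
(the device name hw:0 and the 48 kHz rate here are only placeholders
for the sake of the example) would be:

  $ jackd -d alsa -d hw:0 -r 48000 -p 64 -n 2

which gives a buffer of 64 * 2 = 128 frames, i.e. 128 / 48000 ≈ 2.7 ms.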
With USB devices, the period boundaries (where interrupts are supposed
to happen) are not necessarily coincident with the USB frame boundaries
(where interrupts actually happen). This results in delays (jitter) of
up to 1 ms in the timing of period interrupts; with very small buffer
sizes, this greatly increases the risk of underruns. So if, e.g., the
machine is not able to handle "-p 64 -n 2" reliably, increasing the
number of periods to 3 results in lower latency (3*64=192) than
increasing the period size (2*128=256). (Using "-p 96 -n 2" would have
the same latency, but works only if the JACK version in use allows
period sizes that are not a power of two.)
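In command-line terms (again with the placeholder device hw:0 and a
48 kHz rate), the comparison looks like this:

  $ jackd -d alsa -d hw:0 -r 48000 -p 64 -n 3   # 192 frames ~ 4.0 ms
  $ jackd -d alsa -d hw:0 -r 48000 -p 128 -n 2  # 256 frames ~ 5.3 ms

i.e. the three-period configuration keeps the small period size and
still ends up with the lower total latency.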
Regards,
Clemens