[LAD] JACK and CPU stress testing

Robin Gareus robin at gareus.org
Fri Mar 1 17:28:48 UTC 2013

On 03/01/2013 12:41 PM, Harry van Haaren wrote:
> Hey all,
> I'm currently attempting to stress test my setup of  -rt kernel, rtirq
> scripts, and a Jack client program I've been working on.
> So my idea is to create a script that runs the programs, and also a
> cpu-load generating program (cpuburn or alternative).
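The wrapper-script idea could look something like this (a hypothetical sketch; the client name and load generator are placeholders, here a plain shell busy-loop stands in for cpuburn):

```shell
#!/bin/sh
# Hypothetical stress-test wrapper. "jack_client_under_test" is a
# placeholder for the real JACK client; a shell busy-loop stands in
# for cpuburn or another CPU-load generator.
DURATION=${DURATION:-2}   # seconds of stress per run

# Start one busy-loop per online CPU core to saturate the machine.
NCORES=$(getconf _NPROCESSORS_ONLN)
for i in $(seq 1 "$NCORES"); do
    ( end=$(( $(date +%s) + DURATION ))
      while [ "$(date +%s)" -lt "$end" ]; do :; done ) &
done

# Here one would start the client under test and collect stats, e.g.:
#   jack_client_under_test &
# and afterwards count xrun messages in jackd's output.
wait
echo "stress run of ${DURATION}s complete"
```

Counting xruns would then be a matter of grepping jackd's log output after each run.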

There's a tool to generate artificial JACK DSP load:

which may or may not come in handy.

A while ago I hacked it somewhat for testing CPU frequency scaling:

> Then collecting stats based on Xruns, % DSP load, etc.

drobilla and I did a similar analysis for LV2 plugins. Basically, we
measured the time it takes to process N samples for various block-sizes
and input data. The total execution time is only meaningful for a
single machine; what we were interested in was the variation (error bars).

If processing always takes the same amount of time per sample and is
independent of the input data, the algorithm is very likely RT-safe.
Furthermore, if there is no significant dependence on block-size, that's
even better.

The scripts and data are available at

> I intend to show (through brute force) that an application is RT capable on
> machine X with a latency of Y ms.
> Of course this won't be 100% representative, but the stats will show some
> RT-safe-ness.
> Has anybody done this kind of profiling / stress testing with JACK before?
> Hints / tips / advice / etc welcomed! -Harry

More information about the Linux-audio-dev mailing list