[LAD] Strange inconsistency in jack frame time

Fons Adriaensen fons at linuxaudio.org
Sun Jun 22 21:38:55 UTC 2014


Hello all,

While putting the finishing touches on zita-njbridge and doing a lot
of testing I stumbled on this strange behaviour of jack's frame time.

Let F(i) be the frame time corresponding to the start of period 'i',
and U(i) the corresponding microseconds time.
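
In API terms that would be something like this (a sketch, assuming
jack_get_cycle_times(), which may not be how everyone obtains these;
valid only inside the process callback):

    jack_nframes_t F;        /* frame time at the start of this period */
    jack_time_t    U, next;  /* usecs at start of this / next period   */
    float          per;      /* period duration in usecs               */
    jack_get_cycle_times (client, &F, &U, &next, &per);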

If jack for whatever reason skips some periods, then one would
expect the increments of F(i) and U(i) to remain consistent, i.e.
the jump in F should equal the jump in U converted to frames.
They do in some but not in all cases.

This is the output from a test. Whenever the frame time makes an
unexpected jump, it prints the difference in F, the difference in
U converted to frames, and the difference between these two.
The period size is 256 frames.

dframes =   768    769.0     1.0
dframes =  1024   1322.8   298.8
dframes =  1024   1024.7     0.7
dframes =   768   1067.3   299.3
dframes =  1024   1023.7    -0.3
dframes =   768   1068.5   300.5
dframes =   768    769.2     1.2
dframes =   768   1067.3   299.3
dframes =  1024   1023.9    -0.1
dframes =  1024   1323.1   299.1
dframes =   768    768.2     0.2
dframes =   768   1066.7   298.7
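
The check producing the lines above is essentially this (a
reconstruction, not the actual test code; 'fsamp' is the sample
rate from jack_get_sample_rate()):

    #include <stdio.h>
    #include <jack/jack.h>

    /* Called once per cycle from the process callback. */
    static void check_times (jack_client_t *client, jack_nframes_t nframes,
                             double fsamp)
    {
        static jack_nframes_t F0 = 0;
        static jack_time_t    U0 = 0;
        static int            init = 0;
        jack_nframes_t        F;
        jack_time_t           U, next;
        float                 per;
        double                dF, dU;

        jack_get_cycle_times (client, &F, &U, &next, &per);
        if (init)
        {
            dF = (double)(F - F0);       /* frame time increment       */
            dU = (U - U0) * fsamp / 1e6; /* usecs increment, in frames */
            if (dF != (double) nframes)  /* some periods were skipped  */
                printf ("dframes = %5.0lf  %7.1lf  %6.1lf\n",
                        dF, dU, dU - dF);
        }
        init = 1;
        F0 = F;
        U0 = U;
    }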

The cases where F and U match occur when another client (designed
to be 'heavy') is started; those where they do not match occur when
that same client is terminated (using ^C). For the latter the error
is consistently a full period plus around 43.3 frames, e.g.
299.3 = 256 + 43.3.

So it seems that the frame time is not a reliable method to
determine how many frames have been lost. 
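
If the loss has to be estimated anyway, the usecs time seems the
safer basis. A minimal sketch (my suggestion, not what zita-njbridge
actually does), rounding the elapsed time to a whole number of
periods:

    #include <math.h>
    #include <jack/jack.h>

    /* Estimate frames lost between two consecutive process cycles,
       given their usecs times U0 and U (as above), the period size
       and the sample rate. */
    static long frames_lost (jack_time_t U0, jack_time_t U,
                             jack_nframes_t nframes, double fsamp)
    {
        double dU = (U - U0) * fsamp / 1e6;  /* elapsed time in frames */
        long   np = lrint (dU / (double) nframes);
        return (np - 1) * (long) nframes;    /* whole periods skipped  */
    }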

Using jack1, 0.124.1

Ciao,

-- 
FA

A world of exhaustive, reliable metadata would be a utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)


