On Sat, Aug 15, 2015 at 10:27:33PM +0100, Will Godfrey wrote:
> As a matter of curiosity I wonder if we can work out what the actual
> difference is. Would the Xrun occur when there was a one frame
> difference, one period, or 1 buffer? It's running at 256 frames per
> period and 2 periods per buffer. The xruns occur every 11 minutes
> (and a few seconds).
If you assume that an xrun occurs when the accumulated drift reaches
one period, then the difference in sample rate is dF = P / T, where
P is the period size in frames and T is the time in seconds between
xruns.
With the values you give:
dF = 256 / (11 * 60) =~ 0.39 Hz.
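A quick sanity check of that arithmetic (variable names are mine, not
anything from JACK; the 48 kHz rate is assumed for the ppm figure):

```python
# Estimate the sample-rate mismatch that would cause an xrun every
# T seconds, assuming one full period of drift per xrun.
period_frames = 256               # frames per period (from the post above)
seconds_between_xruns = 11 * 60   # xruns every ~11 minutes

# Drift in samples per second == difference in sample rate (Hz).
dF = period_frames / seconds_between_xruns
print(f"dF = {dF:.4f} Hz")

# Relative to a nominal 48 kHz clock, that is only a few ppm:
ppm = dF / 48000 * 1e6
print(f"about {ppm:.1f} ppm at 48 kHz")
```

A mismatch of that size is well within the tolerance of ordinary
crystal oscillators, so two free-running cards drifting apart like
this is entirely plausible.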
You can measure this easily using jack_delay (or jack_iodelay): the
measured latency will drift over time, and its rate of change, in
samples per second, is the difference in sample rate.
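To turn a series of jack_delay readings into a drift estimate, one
could log the measured latency against time and fit a line; the slope,
in samples per second, is the rate difference. A minimal sketch (the
readings below are synthetic, made up to show ~0.39 samples/s of
drift):

```python
# Fit a straight line to (time, latency) measurements; the slope is
# the sample-rate difference between the two clocks.
times = [0.0, 60.0, 120.0, 180.0, 240.0]            # seconds
latency = [1000.0, 1023.5, 1046.9, 1070.2, 1093.7]  # samples

# Ordinary least-squares slope, computed by hand to stay dependency-free.
n = len(times)
mean_t = sum(times) / n
mean_l = sum(latency) / n
slope = sum((t - mean_t) * (l - mean_l) for t, l in zip(times, latency)) \
        / sum((t - mean_t) ** 2 for t in times)
print(f"drift = {slope:.3f} samples/s")
```

Averaging over a few minutes like this smooths out the jitter in the
individual latency measurements.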
Ciao,
--
FA
A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)