On Sun, Jun 22, 2014 at 06:16:04PM -0400, Paul Davis wrote:

> On Sun, Jun 22, 2014 at 5:38 PM, Fons Adriaensen <fons(a)linuxaudio.org> wrote:
>
>> So it seems that the frame time is not a reliable method to
>> determine how many frames have been lost.
>
> I don't think that anything in the API documentation suggests that it is.
> Xruns, whether caused by clients or kernel scheduling, essentially reset
> the clock. Nothing in jack1 (and I believe nothing in jack2) attempts to
> maintain clock consistency (monotonicity or linearity) across xruns.
Clear. But then the question remains how Jack computes the number
of frames that get added to frame_time when it resumes running cycles.
As you say, the only thing that can be relied on is the microseconds
time, since its source is external to Jack. So the obvious solution
is to use

  delta_frames = delta_time * sample_rate
I don't think you can find anything simpler than that.
In some cases the current value is not even the nearest multiple of
a period (and why should it be a multiple of a period at all?).
Ciao,
--
A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)