On Tue, 2006-06-13 at 16:17 -0400, I. E. Smith-Heisters wrote:
I've futzed around a bit more. I'd done this before, but I'd forgotten
the exact results except that it didn't work. I tried all of this both
with and without RT, and with 16-bit mode forced:
- Upping frames/period to 4096 reduces the number of xruns to several per second.
- Upping periods/buffer to 3 still gives xruns, as well as "usecs
  exceeds estimated spare time" messages.
- Upping periods/buffer to 4 makes initialization fail with "ALSA: got
  smaller periods 2 than 4 for playback".
- Putting it into non-duplex (i.e. playback-only) mode has no effect on the behavior.
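For reference, those settings map onto jackd's ALSA backend flags. A
sketch of the invocations being described -- the device name (hw:0) and
sample rate are assumptions on my part, not stated in the mail:

```shell
# Sketch only: hw:0 and -r 44100 are assumed, not from the original report.

# 4096 frames/period, default 2 periods/buffer: xruns drop to several/second
jackd -R -d alsa -d hw:0 -r 44100 -p 4096 -n 2

# 3 periods/buffer: still xruns, plus "usecs exceeds estimated spare time"
jackd -R -d alsa -d hw:0 -r 44100 -p 4096 -n 3

# 4 periods/buffer: init fails with "ALSA: got smaller periods 2 than 4
# for playback"
jackd -R -d alsa -d hw:0 -r 44100 -p 4096 -n 4

# Playback-only (non-duplex, -P) with 16-bit forced (-S): no change
jackd -R -d alsa -d hw:0 -r 44100 -p 4096 -n 2 -P -S
```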
So, yeah, that's why it's mysterious. In the past I've sacrificed
latency to get rid of xruns, and everything's been dandy. Not so this time...
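To put numbers on the latency being traded away: total buffer latency is
frames/period times periods/buffer divided by the sample rate. A quick
sketch (the 44.1 kHz rate is my assumption; the mail doesn't state it):

```python
# Buffer latency for a given jackd configuration.
# Assumes a 44.1 kHz sample rate, which the original mail does not state.
def jack_latency_ms(frames_per_period, periods_per_buffer, rate=44100):
    """Total buffer latency in milliseconds: frames * periods / rate."""
    return frames_per_period * periods_per_buffer / rate * 1000

# 4096 frames x 2 periods at 44.1 kHz is already ~185.8 ms -- a large
# latency sacrifice that here still leaves several xruns per second.
print(round(jack_latency_ms(4096, 2), 1))  # prints 185.8
print(round(jack_latency_ms(4096, 3), 1))  # prints 278.6
```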
Thanks for the suggestions.
Are you saying that RT mode has no effect on the xruns? I find this
hard to believe.
Check the messages from JACK - maybe it's failing to set RT mode (this is
a bug that's fixed in the development tree).
Try these tests in RT mode as root to be sure.
Are you using the proprietary ATI or Nvidia drivers?
Lee