On Thu, Jan 2, 2014 at 6:49 AM, Bill Gribble <grib(a)billgribble.com> wrote:
Wait a minute. This discussion is making my head spin! How is there any
way at all that increasing the sampling rate, and changing nothing else,
will improve the lowest reliable latency of a system?
Fons' point, which I believe started this, was that the latency caused by
A/D and D/A converters is reduced (or can be reduced) when using a higher
SR. Nothing more. When you're already using very small buffer sizes at the
CPU level, reducing these additional delays in the analog conversion
process can be significant.
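A quick back-of-the-envelope sketch of why: the converter's group delay is
fixed in samples, so its contribution in milliseconds shrinks as the rate
goes up. (The 32-sample figure below is just an assumed delay for
illustration, not a number from any particular device.)

    # Converter delay is fixed in samples; its time cost scales with 1/SR.
    def converter_latency_ms(delay_samples, sample_rate):
        return 1000.0 * delay_samples / sample_rate

    # Hypothetical converter with a fixed 32-sample group delay.
    for sr in (44100, 48000, 96000, 192000):
        print(f"{sr:6d} Hz -> {converter_latency_ms(32, sr):.3f} ms")
    # 44100 Hz -> 0.726 ms, 96000 Hz -> 0.333 ms: same samples, half the time.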
There is an additional point: some devices cannot be configured to use
a buffer size below a given value, and so even if your system could handle
a lower latency setting, the only way to get there is to double the SR.
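For example, if the hardware refuses to go below 64 frames per period (a
made-up minimum, purely for illustration), the only way to shrink the
period time is to raise the rate:

    # Period time for a device whose smallest allowed buffer is fixed.
    def period_ms(frames, sample_rate):
        return 1000.0 * frames / sample_rate

    MIN_FRAMES = 64  # hypothetical hardware minimum; cannot be set lower
    print(period_ms(MIN_FRAMES, 48000))   # ~1.33 ms at 48 kHz
    print(period_ms(MIN_FRAMES, 96000))   # ~0.67 ms at 96 kHz: half the time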