Dang, too bad. I figured it must not be easy because it's not in the
code already.
Thanks for the links. I'll give them a thorough read.
-Caleb
On Thu, Oct 22, 2015 at 2:38 PM, Paul Davis <paul@linuxaudiosystems.com> wrote:
On Thu, Oct 22, 2015 at 5:23 PM, Caleb Crome <caleb@crome.org> wrote:
Hi,
I'm building a test application for doing bit-perfect loopback
testing on my sound card drivers. However, since JACK uses
floating-point sample types, it seems that I can't get bit-perfect
data in and out across the full 16-bit range from 0x0000 to 0xFFFF.
The LSB seems to get flipped by rounding, and JACK also seems to
enforce a range of -32767 to +32767 instead of -32768 to +32767 (for
16-bit audio).
Is it possible that it's as simple as changing
jack_default_audio_sample_t?
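
[A minimal standalone sketch of the round trip Caleb describes, not
code from JACK itself: it pushes every 16-bit value through float
with configurable scale factors. On an IEEE-754 machine, one
consistent factor survives bit-for-bit, clipping to +/-1.0 loses
exactly the value -32768 (which would explain the observed
-32767..+32767 range), and mismatched 32768-in/32767-out factors flip
LSBs across roughly the upper half of the range. Build with
gcc roundtrip.c -lm.]

#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* Round-trip every 16-bit value through float with the given in/out
 * scale factors, optionally clipping to [-1.0, 1.0] the way a float
 * pipeline might. Returns the number of values that change. */
static int roundtrip(float in_scale, float out_scale, int clip)
{
    int errors = 0;
    for (int32_t s = -32768; s <= 32767; s++) {
        float f = (float)s / in_scale;      /* int -> float */
        if (clip) {
            if (f > 1.0f)  f = 1.0f;
            if (f < -1.0f) f = -1.0f;
        }
        int32_t back = (int32_t)lrintf(f * out_scale);  /* float -> int */
        if (back != s)
            errors++;
    }
    return errors;
}

int main(void)
{
    printf("scale 32767 both ways, no clip: %d mismatches\n",
           roundtrip(32767.0f, 32767.0f, 0));
    printf("scale 32767 both ways, clipped: %d mismatches\n",
           roundtrip(32767.0f, 32767.0f, 1));
    printf("scale 32768 in, 32767 out     : %d mismatches\n",
           roundtrip(32768.0f, 32767.0f, 0));
    return 0;
}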
nope.
far easier to just write your own ALSA test client.
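
[For reference, a bare-bones sketch of such an ALSA test client,
assuming a card wired output-to-input, opened as "hw:0", mono S16_LE
at 48 kHz; the device name, rate, channel count, and buffer sizes are
placeholder assumptions, and error handling and xrun recovery are
omitted. Opening the raw hw: device keeps ALSA's plug layer from
converting formats behind your back, which is what makes a
bit-for-bit comparison meaningful. Build with gcc alsatest.c -lasound.]

#include <stdio.h>
#include <alsa/asoundlib.h>

#define N 4096

int main(void)
{
    snd_pcm_t *play, *cap;
    short tx[N], rx[N];

    if (snd_pcm_open(&play, "hw:0", SND_PCM_STREAM_PLAYBACK, 0) < 0 ||
        snd_pcm_open(&cap,  "hw:0", SND_PCM_STREAM_CAPTURE,  0) < 0) {
        fprintf(stderr, "cannot open hw:0\n");
        return 1;
    }

    /* mono, 48 kHz, 0.5 s buffer; soft_resample=0 keeps samples untouched */
    snd_pcm_set_params(play, SND_PCM_FORMAT_S16_LE,
                       SND_PCM_ACCESS_RW_INTERLEAVED, 1, 48000, 0, 500000);
    snd_pcm_set_params(cap,  SND_PCM_FORMAT_S16_LE,
                       SND_PCM_ACCESS_RW_INTERLEAVED, 1, 48000, 0, 500000);

    /* a ramp touching many distinct 16-bit patterns */
    for (int i = 0; i < N; i++)
        tx[i] = (short)(i * 16 - 32768);

    snd_pcm_start(cap);              /* start capture before playback */
    snd_pcm_writei(play, tx, N);
    snd_pcm_readi(cap, rx, N);

    /* the loopback delays the signal, so a real test would locate the
     * ramp in rx and compare bit-for-bit from there; this just dumps
     * the first few received samples */
    for (int i = 0; i < 16; i++)
        printf("%04hx ", (unsigned short)rx[i]);
    printf("\n");

    snd_pcm_close(play);
    snd_pcm_close(cap);
    return 0;
}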
i wouldn't get too deep into this issue unless you've already read bjorn:
http://blog.bjornroche.com/2009/12/int-float-int-its-jungle-out-there.html
http://blog.bjornroche.com/2009/12/linearity-and-dynamic-range-in-int.html
this email thread has some good stuff too:
http://lists.apple.com/archives/coreaudio-api/2009/Dec/msg00046.html
but perhaps you've already done that :)