On Thu, Oct 22, 2015 at 5:23 PM, Caleb Crome <caleb@crome.org> wrote:
Hi,
  I'm building a test application for doing bit-perfect loopback
testing on my sound card drivers.  However, since jack uses floating
point types, it seems that I can't get bit-perfect data in and out
across the full (16-bit) range 0x0000 to 0xFFFF.  The LSB seems to get
rounded on and off, and it also seems that jack enforces a range of
-32767 to +32767 instead of -32768 to +32767 (for 16-bit audio).

Is it possible that it's as simple as changing jack_default_audio_sample_t?

nope.
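
to expand a little: the loss probably isn't float precision as such (a
32-bit float's 24-bit mantissa holds every 16-bit sample value exactly);
the more likely culprit is that the int->float and float->int
conversions at the two ends of the chain use different scale factors,
which would also explain the -32767..+32767 clamp you're seeing. a
quick standalone sketch (the mismatched factors here are an assumption
for illustration, not necessarily what jack does) shows how that flips
LSBs:

#include <stdio.h>
#include <stdint.h>
#include <math.h>

int main(void)
{
        int bad = 0;
        for (int32_t s = -32768; s <= 32767; s++) {
                /* hypothetical mismatch: scale by 1/32768 going in,
                   by 32767 coming back out */
                float f = (float) s / 32768.0f;
                int32_t back = (int32_t) lrintf(f * 32767.0f);
                if (back != s)
                        bad++;
        }
        printf("%d of 65536 values fail the round trip\n", bad);
        return 0;
}

with the *same* factor used in both directions the round trip through a
float is exact for all 65536 values, since the intermediate float has
bits to spare.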

far easier to just write your own ALSA test client.
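
something along these lines (a rough sketch, untested; "hw:0", the
single write/read, and the ignored return codes are all simplifications,
and with a physical loopback you'd have to find the capture latency and
align the buffers before comparing):

#include <alsa/asoundlib.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define RATE   48000
#define FRAMES 65536

int main(void)
{
        snd_pcm_t *play, *cap;
        static int16_t out[FRAMES], in[FRAMES];

        /* ramp through every 16-bit value so any rounding shows up */
        for (int i = 0; i < FRAMES; i++)
                out[i] = (int16_t)(i - 32768);

        if (snd_pcm_open(&play, "hw:0", SND_PCM_STREAM_PLAYBACK, 0) < 0 ||
            snd_pcm_open(&cap, "hw:0", SND_PCM_STREAM_CAPTURE, 0) < 0) {
                fprintf(stderr, "open failed\n");
                return 1;
        }
        /* S16, mono, no software resampling, 500ms of buffering */
        snd_pcm_set_params(play, SND_PCM_FORMAT_S16_LE,
                           SND_PCM_ACCESS_RW_INTERLEAVED, 1, RATE, 0, 500000);
        snd_pcm_set_params(cap, SND_PCM_FORMAT_S16_LE,
                           SND_PCM_ACCESS_RW_INTERLEAVED, 1, RATE, 0, 500000);

        snd_pcm_start(cap);
        snd_pcm_writei(play, out, FRAMES);
        snd_pcm_readi(cap, in, FRAMES);

        /* naive compare; real code would align for loopback latency */
        printf("bit-perfect: %s\n",
               memcmp(out, in, sizeof out) ? "no" : "yes");

        snd_pcm_close(play);
        snd_pcm_close(cap);
        return 0;
}

no float anywhere in the path, so whatever comes back is exactly what
the hardware and driver did to your samples.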

but perhaps you've already done that :)