[Jack-Devel] Is it possible/reasonable to change jack to an int data type?

Caleb Crome caleb at crome.org
Thu Oct 22 23:23:12 CEST 2015

  I'm building a test application for doing bit-perfect loopback
testing on my sound card drivers.  However, since JACK uses floating
point sample types, it seems that I can't get bit-perfect data in and
out across the full 16-bit range (0x0000 to 0xFFFF).  The LSB seems to
get rounded on and off, and JACK also appears to enforce a range of
-32767 to +32767 instead of -32768 to +32767 for 16-bit audio.
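A minimal sketch of the round trip in question, assuming the driver converts 16-bit PCM to float and back with some scale factor (the function names and the NumPy framing are mine, not JACK's). If the same power-of-two scale (32768.0) and round-to-nearest are used in both directions, every 16-bit value survives exactly, because float32's 24-bit mantissa represents each 16-bit integer divided by 2^15 without error; mismatched scales (e.g. 32767 one way, 32768 the other) produce exactly the kind of LSB toggling and clipped -32768 described above:

```python
import numpy as np

def int16_to_float(x, scale=32768.0):
    # 16-bit PCM -> float; a power-of-two scale keeps this exact in float32
    return x.astype(np.float32) / scale

def float_to_int16(f, scale=32768.0):
    # float -> 16-bit PCM with round-to-nearest and saturation
    return np.clip(np.rint(f * scale), -32768, 32767).astype(np.int16)

samples = np.arange(-32768, 32768, dtype=np.int16)  # every 16-bit value

# Matched scales: the round trip is bit-perfect
out = float_to_int16(int16_to_float(samples))
print(np.array_equal(samples, out))  # True

# Mismatched scales: LSBs differ for many values
out_bad = float_to_int16(int16_to_float(samples, scale=32767.0))
print(np.array_equal(samples, out_bad))  # False
```

So the question is really whether the conversions on both sides of the loopback path agree on the scale factor and rounding, not whether float itself can carry 16 bits losslessly.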

Is it possible that the fix is as simple as changing jack_default_audio_sample_t to an integer type?
