[LAD] Denormals / subnormals (again)

Carl Hetherington lists at carlh.net
Tue Jan 3 01:22:11 UTC 2012


Hi all,

I have been looking into ye olde denormal problem a little, lately. 
Particularly with respect to Ardour and plugins.  I've assembled what I 
believe to be a coherent statement of what is going on, but I'd very much 
appreciate any corrections and clarifications that anyone can offer.

Here's how it seems to me:

If you compile your code (e.g. a plugin) without -msse and -mfpmath=sse on 
the GCC command line, you get *no* hardware protection from denormals.  If 
they occur in your code, operations on them will be very much slower than 
on normal floating point numbers (~49 times slower on my Core 2 Duo, ~7 
times slower on a Core i3).  As far as I can see, it does not matter that 
Ardour has been built with those flags: if a plugin has not, you have no 
protection.
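
For what it's worth, the sort of loop I mean is something like this (a 
sketch only, not the program linked below; 1e-40f is just an arbitrary 
subnormal value and the timings will obviously vary):

  /* Build with e.g. gcc -O2 bench.c -o bench.  On 32-bit x86, leave out
     -msse/-mfpmath=sse to exercise the x87 path; 64-bit builds use SSE
     maths by default. */
  #include <stdio.h>
  #include <time.h>

  static double time_loop(volatile float x, long n)
  {
      struct timespec a, b;
      volatile float y = 0.0f;

      clock_gettime(CLOCK_MONOTONIC, &a);
      for (long i = 0; i < n; ++i) {
          /* volatile reads/writes stop the multiply being optimised away */
          y = x * 0.25f;
      }
      clock_gettime(CLOCK_MONOTONIC, &b);
      (void) y;
      return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) * 1e-9;
  }

  int main(void)
  {
      const long n = 100000000L;
      double t_normal    = time_loop(1.0f, n);    /* ordinary operand */
      double t_subnormal = time_loop(1e-40f, n);  /* below FLT_MIN: subnormal */

      printf("normal %.3fs  subnormal %.3fs  ratio %.1fx\n",
             t_normal, t_subnormal, t_subnormal / t_normal);
      return 0;
  }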

If you compile your code with -msse and -mfpmath=sse, you get Ardour's 
protection from denormals.  If the user's CPU supports it, there is no 
significant slowdown with denormals in this mode.  However, CPU support 
is "some later processors with SSE2", according to Intel.
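
For reference, the protection I'm talking about boils down to setting the 
FTZ and DAZ bits in MXCSR.  A rough sketch (not Ardour's actual code; the 
function name is mine) looks like this:

  #include <xmmintrin.h>   /* needs -msse */

  static void enable_denormal_protection(void)
  {
      /* Set FTZ (bit 15) and DAZ (bit 6) in MXCSR: results that would
         be subnormal are flushed to zero, and subnormal inputs are
         treated as zero.  DAZ needs the later SSE2 hardware mentioned
         above. */
      _mm_setcsr(_mm_getcsr() | 0x8040);
  }

MXCSR is per-thread state and only affects code whose float maths actually 
goes through SSE, which is exactly why a plugin built without 
-msse/-mfpmath=sse sees no benefit from the host setting it.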

The problem, I guess, is that we cannot really distribute plugins built 
with SSE instructions, as that would drop support for people with older 
CPUs.  In that case, I think the plugin code itself must avoid denormals, 
otherwise there will be a significant performance hit when they arise.
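
By "avoid denormals" I mean the usual tricks in the DSP code itself.  Here 
is a sketch of one of them (my own illustration, with an arbitrary 
constant and an arbitrary filter):

  /* Keep a tiny offset in the feedback path so the state never decays
     into the subnormal range once the input falls silent.  1e-20 is
     inaudible but comfortably above FLT_MIN (~1.2e-38). */
  #define ANTI_DENORMAL 1.0e-20f

  static inline float one_pole_lp(float *state, float in, float coeff)
  {
      *state = in + coeff * (*state - in) + ANTI_DENORMAL;
      return *state;
  }

Adding a tiny amount of noise, or flushing values below a threshold to 
zero by hand, amounts to the same thing: keep the signal out of the 
subnormal range so the hardware never has to deal with it.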

I've been testing behaviour using a very dumb program which you can get 
from http://carlh.net/software/denormals.tar.gz

I've also been testing plugins using a primitive torture tester that you 
can get from http://carlh.net/software/ or on github via 
git@github.com:cth103/plugin-torture.git

Any comments?

Best

Carl



