[Conrad Berhörster]
I have written a channel class which collects data
from (file) sources and
copies it to a buffer. JACK then takes this buffer and puts it into its
streams. So far, I think, this is a normal design, with the following code:
// mix the source frames into the output buffer, then apply the gain
for (unsigned int n = 0; n < nframes; ++n)
{
    pBuffer[n] += pFrames[n];
    pBuffer[n] *= volume;
}
I know it's not really optimized, but it works as an example. As you can
guess, pBuffer and pFrames are float* and volume is also a float.
Now it happens that when the volume is 0, the CPU load jumps to 100%
after 3-5 seconds.
A workaround is to add the following before the loop:
if (volume < 0.0000001f)
    volume = 0.0000001f;
But I am trying to understand what happens here. Is the compiler
over-optimizing zeros?
[Tim]
It'd be a lot more helpful to see the full source, but from what you're
writing I'd be willing to bet you're encountering denormals.
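Denormals (subnormal floats, roughly below 1e-38 in single precision) are
handled in microcode on most x86 CPUs, so arithmetic on them can be orders
of magnitude slower than on normal values, which would explain the CPU
spike. The usual cure in audio code is to enable flush-to-zero (FTZ) and
denormals-are-zero (DAZ) in the SSE control register once per audio
thread. Here is a minimal sketch, assuming an SSE-capable x86 CPU and the
<xmmintrin.h>/<pmmintrin.h> intrinsics; is_denormal and disable_denormals
are just illustrative names, not part of JACK:

#include <cmath>       // std::fpclassify, FP_SUBNORMAL
#include <xmmintrin.h> // _MM_SET_FLUSH_ZERO_MODE (SSE)
#include <pmmintrin.h> // _MM_SET_DENORMALS_ZERO_MODE (SSE3)

// Probe helper: returns true if x is a denormal (subnormal) float.
bool is_denormal(float x)
{
    return std::fpclassify(x) == FP_SUBNORMAL;
}

// Call once at the start of each audio thread: results that would be
// denormal are flushed to zero, and denormal operands are read as zero.
void disable_denormals()
{
    _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
    _MM_SET_DENORMALS_ZERO_MODE(_MM_DENORMALS_ZERO_ON);
}

With FTZ/DAZ enabled, your original loop should be fine even at
volume == 0, because anything that would denormalize is squashed to a
clean 0.0f. On GCC/x86, -ffast-math also sets these bits at program
startup, but the explicit mode switches above are the targeted fix.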
Cheers, Tim