From: Jens M Andreasen <jens.andreasen@chello.se>
Lately, GPUs have turned into very flexible, massive vector
processors. Precision is 32-bit floats (but you can trade it down
to get more work done).
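
To make that concrete, here is a minimal sketch of a GPU vector
operation, written in CUDA purely for illustration -- the kernel, its
gain parameter, and the buffer size are my own invention, not anything
from the article:

#include <cuda_runtime.h>
#include <stdio.h>

// One thread per sample: a whole audio buffer is processed as a
// single vector operation, in full 32-bit float precision.
__global__ void scale_samples(const float *in, float *out,
                              float gain, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * gain;
    // Trading precision down (e.g. half floats on parts that
    // support them) gets you roughly twice the work per pass.
}

int main(void)
{
    const int n = 1 << 20;
    float *in, *out;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; i++)
        in[i] = (float)i / n;
    scale_samples<<<(n + 255) / 256, 256>>>(in, out, 0.5f, n);
    cudaDeviceSynchronize();
    printf("out[1000] = %f\n", out[1000]);
    cudaFree(in);
    cudaFree(out);
    return 0;
}
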
Yes, it could be that the "invention" implements the algorithms
trivially on the GPU -- the way anyone would implement them when
getting started with GPU programming.
It is like a Nokia mobile connected to the wall power outlet.
The Nokia mobile does not take power directly from the wall, because
that "invention" is patented. The Nokia mobile is forced to take
power from the battery (which could be next to dead). All this even
though devices have taken power directly from the wall for decades.
The GPU is just another piece of new gear to play with, like mobiles.
Anything can be patented, I'm afraid.
PS: You need not worry about patents for performing an algorithm on a
processor -- especially not when the vendor supplies you with a C-like
language for doing exactly that.
The article says quite clearly that the invention is patented.
They would be fools not to try to patent it because the market
is huge.
Neat idea; I have been meaning to try it since the OpenGL2 shader
compilers were released.
For years I have had this idea of using graphics texture functions to
generate audio noise and signals. This kind of app runs naturally on
the GPU.
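
A rough sketch of that idea, again in CUDA for illustration -- the
integer hash standing in for a texture lookup, the seed, and the
buffer length are my assumptions, not Juhana's design:

#include <cuda_runtime.h>
#include <stdio.h>

// Integer hash standing in for a noise-texture lookup; in a
// fragment shader this would be a texture fetch per "pixel".
__device__ float hash_noise(unsigned int i)
{
    i ^= i >> 16; i *= 0x7feb352du;
    i ^= i >> 15; i *= 0x846ca68bu;
    i ^= i >> 16;
    return (i & 0xffffffu) / 16777215.0f * 2.0f - 1.0f; // [-1, 1]
}

// One thread per audio sample: a whole block of white noise is
// generated in a single parallel pass.
__global__ void noise_block(float *buf, unsigned int seed, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        buf[i] = hash_noise(seed + i);
}

int main(void)
{
    const int n = 48000; // one second at 48 kHz
    float *buf;
    cudaMallocManaged(&buf, n * sizeof(float));
    noise_block<<<(n + 255) / 256, 256>>>(buf, 1234u, n);
    cudaDeviceSynchronize();
    printf("first samples: %f %f %f\n", buf[0], buf[1], buf[2]);
    cudaFree(buf);
    return 0;
}

On the graphics hardware of that era the natural fit would be a
fragment shader sampling a noise texture into a render target, then
reading the result back as an audio buffer.
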
Juhana
--
http://music.columbia.edu/mailman/listinfo/linux-graphics-dev
for developers of open source graphics software