On Wed, 08 Jun 2005 09:50, Simon Jenkins <sjenkins@blueyonder.co.uk> sent:
> On Tue, 2005-06-07 at 19:34 -0500, Jan Depner wrote:
> > On Tue, 2005-06-07 at 19:20, Dave Robillard wrote:
"Premature optimization is the root of all
evil". Using C arrays and
strings for no reason when a much more robust higher level type would
suffice is /just as stupid/ as always using slow high-level operations
in time critical code.
It's like arguing about, say, assembly vs. perl. Anyone who says one
side is (always) "better" is automatically wrong. :)
> > True. I usually try to use the right tool for the job.
> > Unfortunately, with the data I work with, the right tool is almost
> > always the fastest tool.
> You must be working with relatively large amounts of relatively simple
> data then.
I'm working with multibeam sonar, airborne topographic and hydrographic
LIDAR, and airborne hyperspectral imagery data.
> Suppose I sum a vector of 5 million integers and it takes 6 seconds. And
> assume - (generously![1]) - that I switch to using an array and now it
> only takes 1 second. Hmmm... a 6 * speedup! So I look to see where else
> my code could benefit from this super performance boost.
>
> Aha! Here's a vector of 5,000 oscillator structures, and it takes 5
> seconds to initialise them all. Switch to using an array and... erm...
> now it only takes 4.995 seconds to initialise them all.
>
> I'm sure my users will notice and appreciate the 5ms saving, but I
> suspect I could have served them better by looking at where my code was
> actually spending most of its time before trying to "optimise" it.
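
(To make the comparison concrete, here's a rough, hypothetical sketch of the
sort of vector-versus-array timing test Simon describes. The 5-million-element
size is his; the timing code and loops are just my illustration, and the actual
numbers will depend entirely on the compiler and optimisation flags.)

    /* Rough benchmark sketch (illustrative only): sum 5 million ints held in
     * a std::vector vs. a plain C array and report the time for each loop. */
    #include <cstddef>
    #include <cstdio>
    #include <vector>
    #include <sys/time.h>

    /* Wall-clock time in seconds, via gettimeofday(). */
    static double now_seconds(void)
    {
        struct timeval tv;
        gettimeofday(&tv, 0);
        return tv.tv_sec + tv.tv_usec * 1e-6;
    }

    int main(void)
    {
        const std::size_t n = 5000000;

        std::vector<int> vec(n, 1);      /* 5 million ints, all set to 1 */
        int *arr = new int[n];
        for (std::size_t i = 0; i < n; ++i)
            arr[i] = 1;

        double t0 = now_seconds();
        long long vsum = 0;
        for (std::size_t i = 0; i < n; ++i)
            vsum += vec[i];
        double vec_time = now_seconds() - t0;

        double t1 = now_seconds();
        long long asum = 0;
        for (std::size_t i = 0; i < n; ++i)
            asum += arr[i];
        double arr_time = now_seconds() - t1;

        std::printf("vector: sum %lld in %.3f s\n", vsum, vec_time);
        std::printf("array:  sum %lld in %.3f s\n", asum, arr_time);

        delete [] arr;
        return 0;
    }

In practice, with optimisation enabled, the two loops often compile to
near-identical code, so the measured gap can be far smaller than the 6x
assumed above.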
As far as data volumes go, for your 5 million integers you're off by about 5
orders of magnitude ;-) So now that 5ms just became 500 seconds. Yes, my users
do notice and appreciate those time savings ;-)
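(Spelled out: roughly 10^5 times more data means the saving scales from about
5 ms to about 5 ms * 100,000 = 500,000 ms, i.e. on the order of 500 seconds
per pass through the data.)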
Jan
> Cheers
> Simon
> [1] It really was a generous assumption: I've assumed that arrays are
> six times as fast as vectors, which in practice they aren't.