Floating Point Benchmarks

by John Walker

There are many disadvantages to being a balding geezer. One compensation, though, if you've managed to live through the second half of the twentieth century and been involved in computing, is having borne personal witness to what happens when a technological transition goes into full-tilt exponential blow-off mode. I'm talking about Moore's Law (actually more an observation than a law, since it rests on physical principles and can't go on forever): computing power available at constant cost doubling every 18 months or so. I've not only seen this happen, I've—er—profited from it; had the 80286-based IBM PC/AT and its competitors not appeared when they did, Autodesk would have been stillborn as too early to market, or drowned out by competitors because we arrived too late.

When Moore's Law (or whatever) is directly connected to your career and your bank account, it's nice to have a little thermometer you can use to see how it's going as the years roll by. This page links to two benchmarks I've used to evaluate computer performance ever since 1980. They focus on things that matter to me: floating point computation speed, evaluation of trigonometric functions, and matrix algebra. If you're interested in text searching or database retrieval speed, you should run screaming from these benchmarks. Hey, they work for me.


December 2nd, 2016