Mathematics and Statistics Colloquium
Friday (10/31/2008) at 2:30pm in 304 Pickard Hall
"To Resist the Resurgence of Roundoff"Abstract: Nobody keeps score, so nobody can know how often scientific and engineering computations in floating- point suffer embarrassing (if found out) anomalies due to roundoff. They cannot be negligible; some known examples generate uneasiness.
The 1960's advances in error-analysis had combined with the 1980's standardization of floating-point arithmetic to promote the promulgation of numerical software via MATLAB and other packages so reliable as to render their users complacent. Alas, current trends could set us back half a century to an era when only the naive trusted an uncorroborated numerical computation.
Desperate efforts to achieve speed via sometimes massive parallelism have motivated algorithms not yet known to be numerically stable for all data. If occasionally a result from such a program arouses suspicion, it will most likely be misdiagnosed. Such programs have become uneconomical to debug because compilers and software development environments lack support for capabilities still built into the most widespread hardware but almost unexercised and consequently threatened by atrophy.
Considering how meager is the supply of error-analysts, an attractive alternative to the impractical debugging of freakish anomalies induced by roundoff is to render them too rare to matter by employing routinely (i.e., by default) arithmetic extravagantly more precise than is needed to hold the input data or the desired output. Extra-precise arithmetic is still built into the most widespread hardware but so ill-supported in programming languages and compilers that it is almost unexercised and consequently threatened by atrophy. How many users of computers are aware of what they are losing?
The market for floating-point arithmetic now goes almost entirely to entertainment and communication tolerant of arithmetic anomalies at levels intolerable to scientists and engineers, most of whom can afford only mass-produced computers. Can the momentum of the market be resisted?
Professor William Kahan received his B.A. in Mathematics in 1954, M.A. in Numerical Analysis and Computing in 1956, and PhD in Numerical Analysis in 1958, all from the University of Toronto. After two years at Cambridge University as a postdoctoral fellow and ten years at the University of Toronto as a faculty member, he joined the University of California at Berkeley in 1969, first as Professor of Computer Science and then of Computer Science and Mathematics. Currently he is Professor Emeritus of Mathematics and of E.E. & Computer Science at UC Berkeley.
Professor Kahan is the main architect of the IEEE 754 floating-point arithmetic standard, for which he won the 1989 ACM Turing Award, widely regarded as the Computer Science equivalent of the Nobel Prize. His many honors include the Dedication of the IEEE ARITH 17 Symposium on Computer Arithmetic (2005), Distinguished Mentor of Undergraduate Research in the College of Letters & Science (UC Berkeley, 2004), the IEEE Emanuel R. Piore Award (2000), the SIAM 1997 John von Neumann Memorial Lecture, the SIAM SIAG Best Paper in Applied Linear Algebra (1991), and the ACM 1st George Forsythe Memorial Award (1971). Professor Kahan is a member of the American Academy of Arts & Sciences, a foreign (Canadian) associate of the National Academy of Engineering, and an ACM Fellow.