While Thomas Sterling’s interview about the impossibility of reaching zettaflops made a lot of sense, the history of making negative predictions about technology is often an embarrassing one.
Note – Thomas Sterling left himself an out that entirely new architectures could achieve zettaflops. So John Barr and Thomas Sterling are in general agreement.
If we wind back the clock to the days of megaflops, there were no commodity microprocessors (i.e., the killer micros that put paid to many proprietary architectures), and there were no multicore processors. Indeed, the Cray-1 was a single-processor machine. There was no OpenMP, no MPI, and compute accelerators were the size of a fridge and cost tens of thousands of dollars.
Who would have thought that today’s HPC systems would use compute accelerators the size of a paperback book that are millions of times more powerful and cost a small fraction of the price? And I’ve lost count of how many times I’ve been told that the next generation of microprocessors would be the last major advance because the photolithography techniques used to manufacture chips had reached a limit, beyond which decreasing the size of devices was impossible. The industry has achieved the impossible before, and will do so again.
Moore’s Law, which states that the number of transistors placed on an integrated circuit doubles every two years, is often taken to mean that performance will double every two years (some say 18 months). What started life as an observation has become the target that marketing men guarantee and engineering budgets are set against. And the straight-line graphs that technologists use to predict the future suggest that zettaflops systems will be built around the year 2030.
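The arithmetic behind such straight-line extrapolations is simple enough to sketch. The figures below are illustrative assumptions, not claims from the interview: a performance doubling roughly every 13.5 months (in line with the historical TOP500 trend, faster than Moore’s two-year transistor doubling) and a first exaflops system around 2019.

```python
import math

# Hedged sketch: extrapolating supercomputer peak performance.
# Assumed inputs (illustrative, not from the article):
#   - performance doubles roughly every 13.5 months (TOP500-style trend)
#   - exaflops arrives around 2019
doubling_months = 13.5
exaflops_year = 2019

# Exaflops -> zettaflops is a factor of 1000, i.e. ~10 doublings.
doublings_needed = math.log2(1000)
years_needed = doublings_needed * doubling_months / 12
zettaflops_year = exaflops_year + years_needed

print(round(doublings_needed, 2))  # ~9.97 doublings
print(round(years_needed, 1))      # ~11.2 years
print(round(zettaflops_year))      # ~2030
```

Under those assumptions the line lands on roughly 2030; slow the doubling to Moore’s two-year cadence and the same 1000x jump pushes well past 2035, which is exactly why the extrapolation is so sensitive to the trend one assumes.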
Professor Sterling pioneered the use of compute clusters and is a Gordon Bell Prize winner. He has excellent credentials in HPC, and I cannot refute a single fact he put forward in his interview; indeed, I am in broad agreement with his insights on the issues the industry faces. But I am certain that he is wrong in his conclusion.