Moore’s Law is not dead, but it has clearly reached old age, and no fundamental technology has emerged to replace it. Whatever comes next is likely to challenge old assumptions both for technologists and society at large.
That was one of the conclusions from an IEEE symposium here exploring the 20-year technology horizon.
All three morning panelists agreed that Moore's Law is approaching its end, and it is not clear what enabling technology could replace it as an engine of exponential technology growth.
“We are no longer on the exponential free ride called Moore’s Law… so we will look for other opportunities that will not be exponential and then shift to something that will change our assumptions,” said Thomas Sterling, a supercomputer guru and professor at Indiana University.
He predicted new kinds of devices and non-von Neumann computer architectures will be adopted in the next 20 years as today's chips and computers run out of gas. In the meantime, "we continue to make computers based on ideas from 40 years ago," he said.
Sterling and others agreed that even with today’s technology, there is still opportunity for orders of magnitude improvements in energy efficiency. “The vast majority of energy cost today is in data movement between chips,” said Sterling.
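Sterling's point about data movement can be made concrete with back-of-the-envelope numbers. The figures below are illustrative assumptions (roughly in line with commonly cited ~45 nm process estimates), not data from the symposium:

```python
# Illustrative energy costs in picojoules per 32-bit operation.
# These values are assumptions for illustration, roughly consistent
# with widely cited ~45 nm estimates -- not measurements from the talk.
ENERGY_PJ = {
    "fp32_add": 0.9,        # on-chip arithmetic
    "sram_read_32b": 5.0,   # on-chip cache access
    "dram_read_32b": 640.0, # off-chip memory access
}

def movement_overhead(op: str, ref: str = "fp32_add") -> float:
    """How many times more energy `op` costs than the reference operation."""
    return ENERGY_PJ[op] / ENERGY_PJ[ref]

if __name__ == "__main__":
    for op in ("sram_read_32b", "dram_read_32b"):
        print(f"{op}: ~{movement_overhead(op):.0f}x the energy of one fp32 add")
```

Under these assumptions, fetching one word from off-chip DRAM costs hundreds of times more energy than the arithmetic performed on it, which is why reducing data movement dominates efficiency research.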
IBM researcher Wilfried Haensch said neuromorphic systems patterned on the human brain look like the most promising bet for post-CMOS advances. "For pattern recognition in big data, this could be a game changer," he said. The approach requires a switch to analog computing, a power-hungry technique that requires research in new materials. Current work focuses on creating devices that could mimic a synapse, he said.
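The synapse-mimicking devices Haensch describes are analog hardware, but the behavior they target can be sketched in software. Below is a minimal leaky integrate-and-fire neuron, a standard textbook abstraction used in neuromorphic work; it is not IBM's design, and the parameter values are arbitrary:

```python
def lif_neuron(inputs, weights, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire neuron.

    The membrane potential decays by `leak` each time step, integrates
    the weighted incoming spikes (the weights play the role of synapses),
    and emits an output spike -- then resets -- when it crosses `threshold`.
    `inputs` is a sequence of binary spike vectors, one per time step.
    """
    v = 0.0
    out_spikes = []
    for spike_vector in inputs:
        v = leak * v + sum(w * s for w, s in zip(weights, spike_vector))
        if v >= threshold:
            out_spikes.append(1)
            v = 0.0  # reset after firing
        else:
            out_spikes.append(0)
    return out_spikes

# Two input channels with synaptic weights 0.6 and 0.5: the neuron fires
# only when enough weighted input accumulates within the leak window.
print(lif_neuron([[1, 0], [1, 1], [0, 0]], [0.6, 0.5]))  # -> [0, 1, 0]
```

In analog neuromorphic hardware, the accumulation and leak happen in device physics (e.g., charge on a capacitor or resistance of a memristive element) rather than in arithmetic, which is where the new-materials research comes in.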
Nextbigfuture has closely tracked neuromorphic development, which appears on track for human-brain-scale neuromorphic systems with 20 billion neurons and 200 trillion synapses by 2019 or 2020.
Carbon nanotubes are seeing a resurgence of interest among researchers. Haensch showed concepts for using them to grow 3D structures on silicon wafers as an alternative to chip stacks using through-silicon vias. "The only fundamental roadblock is that there are no tools to get a reasonable contact at small dimensions," he said.
He also discussed other options such as adiabatic and quantum computing that so far seem less promising.
Nextbigfuture believes the potential of adiabatic and quantum computing will become clearer over the next two to eight years.
Google is working on variations of superconducting qubits with error correction architectures and longer coherence times.
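The error-correction architectures Google is pursuing are quantum codes, but the core idea, redundantly encoding one logical bit so that errors on individual physical bits can be outvoted, has a simple classical analogue. The sketch below is that classical analogue (a 3-bit repetition code), not Google's actual scheme:

```python
import random
from collections import Counter

def encode(bit):
    """Repetition code: one logical bit becomes three physical copies."""
    return [bit, bit, bit]

def noisy_channel(codeword, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in codeword]

def decode(codeword):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return Counter(codeword).most_common(1)[0][0]

# With a physical error rate of 10%, the logical error rate drops to
# roughly 3p^2(1-p) + p^3, i.e. about 2.8% -- the redundancy pays off.
random.seed(1)
trials = 10_000
errors = sum(decode(noisy_channel(encode(0), 0.1)) != 0 for _ in range(trials))
print(f"logical error rate: {errors / trials:.3f}")
```

Quantum codes must additionally protect phase information and cannot copy qubits outright, which is why surface codes and longer coherence times are active research topics, but the logical-versus-physical error trade-off works the same way.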
Entirely different physical implementations will also be applied to quantum computing, and several of them will be scaled up in attempts to achieve superior performance.
Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked the #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker, and a guest on numerous radio shows and podcasts. He is open to public speaking and advising engagements.