This article will show that AI projects are gaining access to petaflops and exaflops of computing power, which matches estimates of the raw compute power of the human brain. However, we still do not have insect-level AI systems, despite having had the raw power for insect-scale AI twenty years ago. AGI lags raw AI compute power and will likely continue to lag it by 30 years or more.
Individual people lag the compute and software capabilities of the leading technology companies. Individuals can afford only a tiny fraction of that compute power, and they also lag in the ability to use the best AI software and the most commercially valuable software.
Individual access to truly powerful means of production often lags the leading edge by 50-100 years. This pattern is repeating with the lag in democratizing search and IT automation.
In 2019, the Cerebras CS-1 AI supercomputer was built around the Wafer Scale Engine (WSE), which was the industry's only trillion-transistor processor at the time. The WSE is the largest chip ever made: at 46,225 square millimeters, it is 56.7 times larger than the largest graphics processing unit. It contains 78 times more AI-optimized compute cores, 3,000 times more high-speed on-chip memory, 10,000 times more memory bandwidth, and 33,000 times more communication bandwidth.
However, surpassing various biological brains in raw compute power does not mean the AI industry can build synthetic AI that matches all the capabilities of even the smaller biological brains.
Twenty years ago, computers surpassed the compute power of insect brains. Insect brains start at about 1,000 neurons.
In 2019, DARPA funded a project to make computing systems as small and efficient as the brains of “very small flying insects.” The Microscale Biomimetic Robust Artificial Intelligence Networks program, or MicroBRAIN, could ultimately result in artificial intelligence systems that can be trained on less data and operated with less energy.
Analyzing insects’ brains, which allow them to navigate the world with minimal information, could also help researchers understand how to build AI systems capable of basic common sense reasoning.
From 2012 to 2018, the compute used in the largest AI training runs increased exponentially with a 3.4-month doubling time. This metric grew by more than 300,000x (a 2-year doubling period would have yielded only about a 7x increase). Improvements in compute have been a key component of AI progress.
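The arithmetic behind the doubling-time claim can be sketched in a few lines. This is an illustrative check using only the article's own figures (300,000x growth, 3.4-month doublings, 2-year doublings for comparison), not fresh measurements:

```python
import math

# How many doublings does 300,000x growth require, and how long does
# that take at the observed 3.4-month doubling cadence versus a
# classic 2-year (Moore's-law-style) cadence?

GROWTH = 300_000        # reported growth in largest-training-run compute
DOUBLING_FAST = 3.4     # months per doubling (observed AI compute trend)
DOUBLING_MOORE = 24     # months per doubling (2-year cadence)

doublings = math.log2(GROWTH)              # about 18.2 doublings
months_needed = doublings * DOUBLING_FAST  # about 62 months, roughly 5 years

# Over that same span, 2-year doublings would compound to only single digits.
moore_growth = 2 ** (months_needed / DOUBLING_MOORE)

print(f"{doublings:.1f} doublings over {months_needed / 12:.1f} years")
print(f"2-year doublings over the same span: about {moore_growth:.0f}x")
```

The contrast is the point: the same half-decade that produced a 300,000x increase would have produced only a single-digit multiple at the traditional hardware cadence.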
In 2020, a 50-petaflop supercomputer could give a project over 4 million petaflop-seconds of compute per day. On a log scale, the compute power available to AI is still following the 3.4-month doubling trend.
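The 4-million-petaflop-seconds figure is a straightforward unit conversion. A quick sketch, assuming the 50-petaflop machine runs at its full rate all day (an idealization):

```python
# Convert a sustained machine rate into the petaflop-second and
# pfs-day units used in AI compute comparisons.

SECONDS_PER_DAY = 24 * 60 * 60    # 86,400 seconds
system_pflops = 50                # sustained petaflops (article's figure)

pfs_per_day = system_pflops * SECONDS_PER_DAY  # petaflop-seconds per day
pfs_days = pfs_per_day / SECONDS_PER_DAY       # same quantity in pfs-days

print(f"{pfs_per_day:,.0f} petaflop-seconds/day = {pfs_days:.0f} pfs-days")
# prints: 4,320,000 petaflop-seconds/day = 50 pfs-days
```

A pfs-day is one petaflop sustained for a full day, so a 50-petaflop system delivers at most 50 pfs-days per day.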
AI hardware has seen five distinct eras:
Before 2012: It was uncommon to use GPUs for machine learning, making compute totals at the scales above difficult to achieve.
2012 to 2014: Infrastructure to train on many GPUs was uncommon, so most results used 1-8 GPUs rated at 1-2 TFLOPS for a total of 0.001-0.1 pfs-days.
2014 to 2016: Large-scale results used 10-100 GPUs rated at 5-10 TFLOPS, resulting in 0.1-10 pfs-days. Diminishing returns on data parallelism meant that larger training runs had limited value.
2016 to 2017: Approaches that allow greater algorithmic parallelism, such as huge batch sizes, architecture search, and expert iteration, along with specialized hardware such as TPUs and faster interconnects, greatly raised these limits, at least for some applications.
2018 to 2020: More dedicated AI supercomputers at multi-petaflop scales.
Exaflop and multi-exaflop systems using specialized AI hardware will clearly arrive in the 2021-2023 timeframe.
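The exaflop projection is consistent with simply extending the doubling trend. A rough sketch, assuming the 3.4-month doubling continued from the article's 50-petaflop 2020 baseline (an illustrative extrapolation, not a forecast model):

```python
# Extrapolate AI compute scale three years forward at the observed
# 3.4-month doubling time, starting from a 50-petaflop 2020 system.

baseline_pflops = 50     # 2020 AI supercomputer (article's figure)
doubling_months = 3.4    # observed doubling time for AI compute
months_ahead = 36        # 2020 -> 2023

projected_pflops = baseline_pflops * 2 ** (months_ahead / doubling_months)
print(f"Projected 2023 scale: about {projected_pflops / 1000:.0f} exaflops")
```

Even if the trend slowed well short of this, a fraction of the extrapolated figure still lands in multi-exaflop territory.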
A human brain is estimated to have about a petaflop of processing power.
Individuals Lag in Productivity Enhancement
People will not just lag AGIs; people are already lagging the AI- and software-enabled technology companies.
For individuals to match up to AI-supercomputer-enabled corporations, there need to be systems the common person can use for internet search, e-commerce, and access to DNA information and analysis. The productivity capabilities that were available to Google in 2000 are still not available to individuals.
AI and software agents need to be made available. Individual education is not complete if people do not have the understanding to leverage available technical resources.
The lag in empowering people with technology needs to be reduced.
In 2019, Gartner projected that AI augmentation would create $2.9 trillion in value in 2021. Currently, individuals can mainly benefit from this value creation by buying shares in Google, Facebook, and the other companies most successful at monetizing and taking advantage of superior computing systems.
SOURCES- OpenAI, Singularity University, Gartner
Written By Brian Wang, Nextbigfuture.com
Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked the #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.