Here is the David Reynolds interview by Sander Olson. Mr. Reynolds is the co-founder and VP of Product Development of Lyric Semiconductor, which has recently emerged from stealth mode. Lyric has just announced new intellectual property based on probabilistic processing, which is increasingly being used for a wide variety of computing tasks, and Lyric claims massive power and efficiency improvements from its IP. Lyric states that its upcoming GP5 will be able to perform probabilistic computations 1,000x more efficiently than a conventional CPU.
Ben Vigoda thesis paper (209 pages) (H/T reader cclaan)
Question: Your company, Lyric Semiconductor, is just emerging from stealth mode. How long has Lyric been operating?
Answer: The technology was originally created by Ben Vigoda at MIT, who wrote his PhD thesis on probability processing. Lyric was founded by Ben and me; our chairman and sole VC is the semiconductor industry veteran Ray Stata. We have so far raised over $20 million from DARPA and Stata Venture Partners. We have already filed 50 patents, and have about 30 employees.
Question: You claim to have a technology designed to greatly improve FLASH memory performance. Why is this important?
Answer: Flash memory chips are the storage chips that form the main components of solid-state drives. But as semiconductor scaling continues and memory cells get smaller, the number of bad bits increases. By the 20nm node, there will probably be one error for every 100 bits, which is clearly unacceptable. We have found a way to solve this problem efficiently.
Question: How does Lyric’s approach work?
Answer: Our approach works on probabilities. We use a probabilistic algorithm that determines, with extreme accuracy, the probability that a given bit is a one or a zero. By employing this algorithm we can reduce error rates in 20nm flash memories from one bit per hundred to one bit per quadrillion.
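The core idea of inferring whether a noisy cell holds a one or a zero can be sketched as a simple Bayesian update. This is only an illustrative toy, not Lyric's actual decoder; the function name and the 1% raw error rate are assumptions for the example.

```python
def bit_posterior(prior_one, readings, p_flip):
    """Update the probability that a stored bit is 1, given repeated
    noisy readings of the cell. p_flip is the raw read-error rate."""
    odds = prior_one / (1.0 - prior_one)
    for r in readings:
        # Likelihood ratio P(reading | bit=1) / P(reading | bit=0)
        lr = (1 - p_flip) / p_flip if r == 1 else p_flip / (1 - p_flip)
        odds *= lr
    return odds / (1.0 + odds)

# Three reads of a cell with a 1% raw error rate all agree on '1';
# the posterior probability of '1' becomes overwhelmingly close to 1.
p = bit_posterior(0.5, [1, 1, 1], 0.01)
```

Even with individually unreliable reads, combining evidence multiplicatively drives the residual error probability down very fast, which is the intuition behind the "one bit per hundred to one bit per quadrillion" claim.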
Question: But conventional Error-Correcting Code (ECC) methods already exist.
Answer: Yes, but conventional ECC is not a mainstream solution. Current ECC methods are too large, too bandwidth-intensive, and too power-hungry to be cost-effective in the vast majority of FLASH applications. For these reasons, ECC is currently limited to high-end SSDs. By contrast, our approach requires only modest overhead.
Question: Can these chips be fabricated using standard CMOS?
Answer: Absolutely. In fact, these ICs only require 3 metal layers, considerably fewer than the 9 or so layers that a modern Intel CPU would have. This technology scales according to Moore’s law and TSMC has had no difficulty fabricating our ICs. The ICs we make are tiny compared to standard chips made on an equivalent process. In fact we already have a second generation chip operating in our labs.
Question: Can this technique be used for more than making more reliable FLASH?
Answer: Improving FLASH memory is only one of the tasks for which probabilistic processing is well suited. A wide variety of applications actually involve probabilistic computation. This was not the case before the 21st century, but narrow AI applications are becoming increasingly common, and most of them rely on this method of computation.
Question: How exactly is probabilistic processing different from traditional digital computation?
Answer: Probabilistic algorithms assign different weights to various possible solutions to a problem, with the perceived optimal solution receiving the largest weight. In theory, classical computing can perform any computing task, but modern CPUs do probabilistic processing inefficiently. That is why there is a clear need for our chips, which can do probabilistic processing extremely efficiently.
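The weighting idea described above can be shown in a few lines: normalize the weights into probabilities, then pick the hypothesis with the largest one. The function and the candidate labels are hypothetical, purely for illustration.

```python
def most_likely(weights):
    """Pick the hypothesis with the largest weight, after
    normalizing the weights so they sum to 1."""
    total = sum(weights.values())
    probs = {h: w / total for h, w in weights.items()}
    best = max(probs, key=probs.get)
    return best, probs[best]

# Unnormalized scores for three candidate interpretations of a noisy input:
best, p = most_likely({"A": 6.0, "B": 3.0, "C": 1.0})
# best is "A" with probability 0.6
```

A conventional CPU spends many instructions on each of these weight updates; Lyric's pitch is that dedicated probability hardware performs the same normalize-and-compare operations natively.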
Question: How many applications currently employ probabilistic processing?
Answer: Countless applications employ this technique. Credit card companies use probabilistic algorithms to decide whether to approve transactions. Email systems use it to filter spam. Banks use this technique to decide to approve or decline mortgage requests. Even mobile electronics use this method to discern faces and objects. Although rare before the 21st century, probabilistic computations are becoming ubiquitous in modern computing.
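The spam-filtering example is the classic case of probabilistic computation: a naive-Bayes filter multiplies per-word likelihoods to score a message. The sketch below is a generic textbook version with made-up word probabilities, not any particular vendor's filter.

```python
import math

def spam_score(words, p_word_spam, p_word_ham, prior_spam=0.5):
    """Naive-Bayes probability that a message is spam, given per-word
    likelihoods. Works in log space to avoid underflow on long messages."""
    log_spam = math.log(prior_spam)
    log_ham = math.log(1 - prior_spam)
    for w in words:
        log_spam += math.log(p_word_spam.get(w, 0.01))
        log_ham += math.log(p_word_ham.get(w, 0.01))
    # Convert back from log space and normalize to a probability
    m = max(log_spam, log_ham)
    s, h = math.exp(log_spam - m), math.exp(log_ham - m)
    return s / (s + h)

p = spam_score(["free", "winner"],
               {"free": 0.30, "winner": 0.20},
               {"free": 0.02, "winner": 0.01})
# p is very close to 1: both words are far likelier in spam
```

Note that the inner loop is nothing but repeated probability multiplies, which is exactly the operation Lyric claims to accelerate in hardware.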
Question: Do many data centers, such as those owned by Amazon and Google, make extensive use of probabilistic processing?
Answer: Data centers in particular would benefit from employing our chips. Many data centers have power bills in the millions of dollars per month, so they would save substantial sums by employing our technology. Data centers also consume about 1% of the electricity in the U.S., making them major carbon emitters. By aggressively employing our technology, data centers could reduce their physical footprint, power consumption, and carbon emissions by 10x.
Question: How exactly does Lyric achieve such dramatic power savings?
Answer: In order to compute a probability multiply operation using conventional digital logic, one would need to employ perhaps 500 transistors. By contrast, using a Lyric IC, only a few transistors would be required to get the same result. For an individual using a desktop or mobile device the power savings would be noticeable, but for a large data center the savings could be in the millions of dollars per year range.
Question: How does the efficiency of your probability chips compare with the efficiency of FPGAs, GPUs, and DSPs?
Answer: For any probability task, our technology is clearly superior to any FPGA, GPU, DSP, or CPU solution. This applies to every important metric – power consumption, die area, and speed of computation. This technology isn't designed to replace CPUs or GPUs, but rather to complement them.
Question: Could this technology be directly integrated into ICs such as CPUs?
Answer: Yes, what we are selling is essentially our IP. Since our technology can be implemented with relatively few transistors, it could easily be integrated into a CPU or GPU die. In that case our probabilistic unit would only end up taking up a small portion of the die.
Question: How many potential applications could take advantage of probabilistic processing?
Answer: The number of potential applications is huge. Genome analysis is one obvious example: in many respects it resembles error-correction decoding, and the number of complete genomes is doubling every two years, so it would definitely benefit from our technology. The list of potential applications is growing rapidly.
A distributed, reconfigurable statistical signal processing apparatus comprises an array of discrete-time analog signal processing circuitry for statistical signal processing based on a local message-passing algorithm and digital configuration circuitry.
Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.