Reservoir computing is an approach to machine learning that trains quickly because most of the model is never trained. The word reservoir refers to a dynamical system: a collection of randomly and recurrently connected units. In its main forms, reservoir computing uses a recurrent neural network, but instead of updating all of the network's parameters, training updates only the output (readout) parameters and leaves the rest fixed at their random initial values. Reservoir computing (RC) is well suited to processing temporal or sequential data.
* RC transforms the sequential inputs nonlinearly into a high-dimensional space so that the features of the inputs can be efficiently read out by a simple learning algorithm.
* RC systems are easy and fast to train, since only the readout is fitted
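The two bullets above can be sketched as a tiny echo state network in NumPy. This is a toy illustration, not any particular production system: the sine-prediction task, the 100-unit reservoir, and all scaling constants are arbitrary choices. The key point it demonstrates is that the recurrent weights are frozen after random initialization, and only a linear readout is fitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: only the readout W_out is ever trained.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))      # random input weights (frozen)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))        # random recurrent weights (frozen)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        # Nonlinear update projects the input into a high-dimensional space.
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict sin(t) one step ahead.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Training = ridge regression on the collected states; this is the only learning step.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("mean squared error after washout:", np.mean((pred[500:] - y[500:]) ** 2))
```

The readout fit is a single linear solve, which is why training is fast; the first few hundred steps are discarded as "washout" while the reservoir forgets its arbitrary initial state.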
Reservoir computing has been around in different forms for decades.
In physical RC, a complex physical system is used to transform inputs into outputs. That physical system can be biological, liquid-based, or one of many other substrates, which makes reservoir computing quite different from conventional digital computing. One of the reasons I, Brian Wang, am reviewing this field and trying to summarize its difficult concepts is that a company in Canada claims to have developed a quantum analog computing system that builds on reservoir computing. Many academic institutions are also trying to leverage quantum and superconducting systems to build reservoir computers.
Random recurrent neural networks can be trained to produce complex behaviors that mimic the input/output relationships of recurrent networks in the brain. The key point is that these networks can produce complex temporal dynamics, even in the absence of input, unlike static feedforward neural networks.
Reservoirs retain a fading memory of past inputs. The two classic formulations of reservoir computing are liquid state machines and echo state networks.
Reservoir computing has similarities to artificial neural networks, but instead of backpropagation it relies on forward evolution: states are generated by simply running the dynamics forward.
Instead of adjusting network node weights, it can use a black box of almost any physical system and train a readout against its inputs and outputs.
Unlike regular deep neural networks, the reservoir itself does not need explicit weights and nodes.
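The black-box idea can be made concrete with a small simulation. Here a made-up nonlinear map with hidden, untrained internal parameters stands in for any physical substrate: we drive it with inputs, record only its observable responses, and fit a linear readout by least squares. The dynamics, the toy memory task, and all sizes are illustrative assumptions, not a model of any real device.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "black box": hidden parameters we never adjust, only observe through.
a = rng.uniform(0.2, 0.8, 20)                  # hidden internal parameters (untrained)

def black_box(u_seq):
    """Return the observable responses of the black box to input sequence u_seq."""
    s = np.zeros(20)
    out = []
    for u_t in u_seq:
        s = np.sin(a * s + u_t)                # unknown internal dynamics
        out.append(s.copy())                   # all we get to see are these readings
    return np.array(out)

# Toy target that needs one step of input memory: y_t = 0.5*u_t + 0.3*u_{t-1}.
u = rng.uniform(0, 0.5, 1000)
y = 0.5 * u[1:] + 0.3 * u[:-1]
X = black_box(u[1:])

# Training = least squares from recorded observations to targets; the box is untouched.
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ W_out
print("mean squared error after washout:", np.mean((pred[20:] - y[20:]) ** 2))
```

Because the target depends on the previous input as well as the current one, the fit only works if the box's internal state carries memory of past inputs, which is exactly the property a reservoir provides.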
Quantum reservoir computing may use the nonlinear nature of quantum mechanical interactions or processes to form the characteristic nonlinear reservoir. It may also be done with a linear reservoir, when the injection of the input into the reservoir creates the nonlinearity. The marriage of machine learning and quantum devices is leading to the emergence of quantum neuromorphic computing as a new research area.
Because almost any physical system can be used, there are many variations on reservoir computing and quantum reservoir computing.
2-D quantum dot lattices
In this architecture, randomized coupling between lattice sites gives the reservoir the “black box” property inherent to reservoir processors. The reservoir is then excited by an incident optical field, which acts as the input. Readout occurs in the form of occupation numbers of lattice sites, which are naturally nonlinear functions of the input.
Nuclear spins in a molecular solid
In this architecture, quantum mechanical coupling between spins of neighboring atoms within the molecular solid provides the nonlinearity required to create the higher-dimensional computational space. The reservoir is then excited by radiofrequency electromagnetic radiation tuned to the resonance frequencies of the relevant nuclear spins. Readout occurs by measuring the nuclear spin states.
Reservoir computing on gate-based near-term superconducting quantum computers
The most prevalent model of quantum computing is the gate-based model where quantum computation is performed by sequential applications of unitary quantum gates on qubits of a quantum computer. A theory for the implementation of reservoir computing on a gate-based quantum computer with proof-of-principle demonstrations on a number of IBM superconducting noisy intermediate-scale quantum (NISQ) computers has been reported.
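A toy classical simulation can convey the gate-based idea: a scalar input is encoded as a single-qubit rotation, a fixed random circuit unitary (never trained) scrambles the state, and Pauli-Z expectation values serve as features for a trained linear readout. This is a simplified single-step sketch, not the reported IBM NISQ implementation; the encoding, the random circuit, and the toy task are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_qubits = 3
dim = 2 ** n_qubits

# Fixed random unitary standing in for an untrained reservoir circuit.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U_res, _ = np.linalg.qr(A)

def encode(u):
    """Encode scalar input u as an RX(pi*u) rotation on qubit 0 (illustrative choice)."""
    th = np.pi * u
    rx = np.array([[np.cos(th / 2), -1j * np.sin(th / 2)],
                   [-1j * np.sin(th / 2), np.cos(th / 2)]])
    return np.kron(rx, np.eye(dim // 2))

# Measure <Z> on each qubit; these expectation values are the readout features.
Z = np.diag([1.0, -1.0])
obs = [np.kron(np.eye(2 ** k), np.kron(Z, np.eye(2 ** (n_qubits - k - 1))))
       for k in range(n_qubits)]

def quantum_features(u):
    psi = np.zeros(dim, dtype=complex)
    psi[0] = 1.0                           # start each run in |000>
    psi = U_res @ (encode(u) @ psi)        # inject the input, apply the fixed circuit
    return [float(np.real(np.vdot(psi, O @ psi))) for O in obs]

# Toy regression task: only the linear readout is trained; the circuit never changes.
u = rng.uniform(0, 1, 400)
y = u ** 2
X = np.array([quantum_features(ui) for ui in u])
X1 = np.c_[X, np.ones(len(X))]             # add a bias column
W_out, *_ = np.linalg.lstsq(X1, y, rcond=None)
pred = X1 @ W_out
```

On real hardware the expectation values would come from repeated shots rather than exact state-vector math, and temporal memory would come from re-injecting inputs into an evolving register; this sketch keeps only the core pattern of a fixed random circuit plus a trained linear readout.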
The massive flexibility of reservoir computing suggests that interesting emergent intelligence or solution-generating systems could arise from complexity and chaos. These systems rely on forward evolution of their dynamics rather than backward propagation of errors.
SOURCES- Wikipedia, REND-REU Program, Zachary Kilpatrick, Fields Institute, Analytics India Mag
Written by Brian Wang, Nextbigfuture.com
Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.