Polymer memristor used as basis for a perceptron

The perceptron, invented by Frank Rosenblatt in 1957 and popularized in Marvin Minsky and Seymour Papert's 1969 book Perceptrons: An Introduction to Computational Geometry, may no longer be just theory.

Russian and Italian scientists, led by Vyacheslav Demin at the Moscow Institute of Physics and Technology and the National Research Center Kurchatov Institute (Moscow), have described such a device in detail in a paper titled "Hardware elementary perceptron based on polyaniline memristive devices."

The most exciting element of their perceptron was the polymer memristor they constructed from organic polyaniline (PANI), a highly conductive polymer that has been used as the active electronic component in experimental non-volatile memories.

The Russian memristor is fabricated from polyaniline and has already been proven capable of realizing Rosenblatt's elementary perceptron. (Source: Moscow Institute of Physics and Technology)

Organic Electronics – Hardware elementary perceptron based on polyaniline memristive devices

Highlights
• An elementary perceptron is realized with polyaniline-based memristive devices.
• A training procedure is demonstrated for the perceptron to perform the NAND and NOR functions.
• The possibility of realizing a neural network with organic memristive links is shown.

Abstract
The elementary perceptron is an artificial neural network with a single layer of adaptive links and one output neuron; it can solve simple linearly separable tasks such as invariant pattern recognition, linear approximation, and prediction. We report the hardware realization of an elementary perceptron using polyaniline-based memristive devices as the analog link weights. An error-correction algorithm was used to train the perceptron to implement the NAND and NOR logic functions as examples of linearly separable tasks. The physical realization of an elementary perceptron demonstrates the ability to form hardware-based neuromorphic networks from organic memristive devices. The results hold great promise for new approaches to very compact, non-volatile, high-performance neurochips that could serve a huge number of intelligent products and applications.
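
In textbook notation, the "error-correction algorithm" the abstract refers to is the classic perceptron learning rule. A sketch in LaTeX, using conventional symbols rather than notation taken from the paper:

% Thresholded weighted sum computed by the single output neuron
y = H\Big(\sum_i w_i x_i - \theta\Big), \qquad
H(z) = \begin{cases} 1, & z \ge 0 \\ 0, & z < 0 \end{cases}

% Error-correction update for each adaptive link
% (t = target output, \eta = learning rate)
w_i \leftarrow w_i + \eta \,(t - y)\, x_i

In the hardware version, each weight w_i is the conductance of one memristive link, and the update is applied by driving current through the device in one direction or the other.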

How it works
Polyaniline memristors have been demonstrated before, both individually and in non-volatile memories, but the Russian and Italian scientists claim theirs is the first implementation to be formed into a genuine analog neural network: a single-layer perceptron. Their memristors were fabricated at the millimeter scale for convenience, using a polyaniline solution, a glass substrate, and chromium electrodes, but the researchers claim that within five years they could be manufactured at 10 nanometers, rivaling silicon chips.

The memristor prototype is still quite large (the coin is about half the size of a U.S. penny) but the researchers say it can be downsized to 10 nanometers. (Source: Moscow Institute of Physics and Technology)

When characterizing the polyaniline memristors, the researchers found that the devices exhibit a natural built-in hysteresis, a very desirable quality for digital non-volatile memories. For analog applications, the hysteresis turned out to be mild enough for the memristors to operate in the middle analog range of the total hysteresis curve. As a result, the polyaniline memristors were able to emulate the function of the synapses between the brain's neurons: becoming more conductive the more they are used, and atrophying toward zero conductance when current flows in the opposite direction.
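
To make the synapse analogy concrete, here is a minimal Python sketch of that behavior: forward current potentiates the device, reverse current depresses it toward zero, and the conductance stays clamped inside the analog window. The class name, rate constant, and bounds are illustrative placeholders, not measured PANI device parameters.

class MemristiveSynapse:
    """Toy model of an analog memristive link (illustrative, not PANI physics)."""

    def __init__(self, g_min=0.0, g_max=1.0, rate=0.05):
        self.g = 0.5 * (g_min + g_max)  # start mid-range on the hysteresis curve
        self.g_min, self.g_max, self.rate = g_min, g_max, rate

    def apply_current(self, i):
        """Forward current (i > 0) raises conductance; reverse current lowers it."""
        self.g = min(self.g_max, max(self.g_min, self.g + self.rate * i))

    def transmit(self, v):
        """Ohmic read: output current for an applied voltage v."""
        return self.g * v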

To prove their point, the researchers trained the perceptron to learn the digital NAND and NOR functions, as well as other linearly separable operations such as invariant pattern recognition and linear approximation. A standard error-correction algorithm, which feeds error signals back to adjust the conductivity of the memristive synapses, allowed the neural network to learn, giving the researchers hope that future multi-layer versions will be able to execute deep-learning tasks at much higher speed than they can be simulated on digital computers today.
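
A software analogue of that training loop, assuming the standard single-layer error-correction rule with plain floating-point weights standing in for memristor conductances (the learning rate and epoch count here are arbitrary choices, not values from the paper):

def train_perceptron(samples, eta=0.2, epochs=50):
    """Single-layer perceptron trained with the error-correction rule.

    samples: list of ((x1, x2), target) pairs, e.g. a NAND truth table.
    Returns (weights, bias); the weights play the role of memristor conductances.
    """
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else 0
            err = t - y  # positive error potentiates, negative depresses
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

nand = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
nor = [((0, 0), 1), ((0, 1), 0), ((1, 0), 0), ((1, 1), 0)]
for name, table in (("NAND", nand), ("NOR", nor)):
    w, b = train_perceptron(table)
    outputs = [1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else 0 for x, _ in table]
    print(name, outputs)

Because NAND and NOR are linearly separable, the rule is guaranteed to converge on them; XOR, famously, is not, and would require the multi-layer networks the researchers plan next.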

Next, besides downsizing to the nanoscale, the researchers also intend to implement multi-layer deep-learning neural networks using the third dimension, stacking network layers vertically into 3-D structures.