IBM Scientists Imitate the Functionality of Neurons with a Phase-Change Device; Each Neuron Update Uses Less Than Five Picojoules

IBM scientists have created randomly spiking neurons using phase-change materials to store and process data. This demonstration marks a significant step forward in the development of energy-efficient, ultra-dense integrated neuromorphic technologies for applications in cognitive computing.

Inspired by the way the biological brain functions, scientists have theorized for decades that it should be possible to imitate the versatile computational capabilities of large populations of neurons. However, doing so at densities and with a power budget that would be comparable to those seen in biology has been a significant challenge, until now.

“We have been researching phase-change materials for memory applications for over a decade, and our progress in the past 24 months has been remarkable,” said IBM Fellow Evangelos Eleftheriou. “In this period, we have discovered and published new memory techniques, including projected memory, stored 3 bits per cell in phase-change memory for the first time, and now are demonstrating the powerful capabilities of phase-change-based artificial neurons, which can perform various computational primitives such as data-correlation detection and unsupervised learning at high speeds using very little energy.”

Nanotechnology journal – All-memristive neuromorphic computing with level-tuned neurons

The artificial neurons designed by IBM scientists in Zurich consist of phase-change materials, including germanium antimony telluride, which exhibit two stable states, an amorphous one (without a clearly defined structure) and a crystalline one (with structure). These materials are the basis of re-writable Blu-ray discs. However, the artificial neurons do not store digital information; they are analog, just like the synapses and neurons in our biological brain.

In the published demonstration, the team applied a series of electrical pulses to the artificial neurons, which resulted in the progressive crystallization of the phase-change material, ultimately causing the neuron to fire. In neuroscience, this function is known as the integrate-and-fire property of biological neurons. This is the foundation for event-based computation and, in principle, is similar to how our brain triggers a response when we touch something hot.
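The integrate-and-fire mechanism is easy to picture in code. Below is a minimal sketch, assuming a simplified model in which each electrical pulse advances a crystallization variable by a roughly fixed amount and the neuron fires and re-amorphizes once that variable crosses a threshold; the step size, threshold, and variability range are illustrative values, not the IBM device's parameters.

```python
import random

# Minimal sketch of the integrate-and-fire behavior described above.
# Step size, threshold, and variability range are illustrative
# assumptions, not parameters of the IBM device.

class PhaseChangeNeuron:
    def __init__(self, threshold=1.0, step=0.12):
        self.threshold = threshold   # crystalline fraction at which the neuron fires
        self.step = step             # nominal crystallization added per input pulse
        self.crystal_fraction = 0.0  # 0.0 = fully amorphous, 1.0 = fully crystalline

    def pulse(self):
        """Apply one electrical pulse; return True if the neuron fires."""
        # Each pulse advances crystallization by a slightly random amount,
        # loosely mimicking the stochastic spiking reported for the devices.
        self.crystal_fraction += self.step * random.uniform(0.8, 1.2)
        if self.crystal_fraction >= self.threshold:
            self.crystal_fraction = 0.0  # reset: melt-quench back to the amorphous state
            return True
        return False

neuron = PhaseChangeNeuron()
for t in range(30):
    if neuron.pulse():
        print(f"spike at pulse {t}")
```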

Exploiting this integrate-and-fire property, even a single neuron can be used to detect patterns and discover correlations in real-time streams of event-based data. For example, in the Internet of Things, sensors could collect and analyze large volumes of weather data at the edge for faster forecasts. The artificial neurons could be used to detect patterns in financial transactions to find discrepancies, or to mine data from social media to discover new cultural trends in real time. Large populations of these high-speed, low-energy nano-scale neurons could also be used in neuromorphic coprocessors with co-located memory and processing units.
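To make the correlation-detection idea concrete, here is a hedged sketch: a single leaky integrate-and-fire neuron paired with a simple Hebbian-style weight update (a stand-in for the plasticity rule actually used in the paper). Synapses that are active when the neuron fires are potentiated and the rest are depressed, so the weights of temporally correlated streams separate from those of uncorrelated ones. All stream counts, rates, and learning constants are illustrative.

```python
import random

# Hedged sketch: one leaky integrate-and-fire neuron with a simple
# Hebbian-style update separating 4 correlated event streams from 12
# uncorrelated ones. All parameters are illustrative, not from the paper.

N_CORR, N_UNCORR, STEPS = 4, 12, 5000
RATE, THRESHOLD, LEAK, LEARN = 0.2, 1.8, 0.5, 0.02
weights = [0.5] * (N_CORR + N_UNCORR)
potential = 0.0

for _ in range(STEPS):
    shared = random.random() < RATE                  # hidden process behind the correlated streams
    events = [shared] * N_CORR + [random.random() < RATE for _ in range(N_UNCORR)]
    potential = potential * LEAK + sum(w for w, e in zip(weights, events) if e)
    if potential >= THRESHOLD:                       # integrate-and-fire
        potential = 0.0
        for i, e in enumerate(events):               # potentiate synapses active at the spike,
            delta = LEARN if e else -LEARN           # depress the silent ones
            weights[i] = min(max(weights[i] + delta, 0.0), 1.0)

print("correlated  :", [round(w, 2) for w in weights[:N_CORR]])
print("uncorrelated:", [round(w, 2) for w in weights[N_CORR:]])
```

After training, the correlated weights drift toward their upper bound while the uncorrelated ones decay, which is the qualitative behavior the paper demonstrates at the device level.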

IBM scientists have organized hundreds of artificial neurons into populations and used them to represent fast and complex signals. Moreover, the artificial neurons have been shown to sustain billions of switching cycles, which would correspond to multiple years of operation at an update frequency of 100 Hz. The energy required for each neuron update was less than five picojoules, and the average power was less than 120 microwatts; for comparison, a 60-watt lightbulb draws 60 million microwatts.
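The lifetime figure can be sanity-checked from the numbers quoted above: at 100 updates per second, ten billion switching cycles correspond to roughly three years of continuous operation.

```python
# Back-of-the-envelope check of the endurance claim, using only the
# figures quoted above: billions of cycles at a 100 Hz update rate.

SECONDS_PER_YEAR = 365 * 24 * 3600           # ~3.15e7 seconds
UPDATE_RATE_HZ = 100                         # update frequency from the article

for cycles in (1e9, 1e10):                   # "billions of switching cycles"
    years = cycles / UPDATE_RATE_HZ / SECONDS_PER_YEAR
    print(f"{cycles:.0e} cycles -> {years:.2f} years of operation")
```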

“Populations of stochastic phase-change neurons, combined with other nanoscale computational elements such as artificial synapses, could be a key enabler for the creation of a new generation of extremely dense neuromorphic computing systems,” said Tomas Tuma, a co-author of the paper.

Summary and future work

With this paper, the authors' contribution is two-fold:

1. They showed how the single-neuron building block of a spiking neural network (SNN) can be realized with nanoscale phase-change devices in an all-memristive configuration.

2. This computational primitive was incorporated into a neuromorphic architecture enhanced with the biologically inspired scheme of level-tuned neurons.

They demonstrated experimentally that the proposed all-memristive neuromorphic architecture is capable of learning multiple correlations from a large number of input streams in an unsupervised manner.

Inherent characteristics of phase-change memristors, such as multilevel storage, accumulation, and state-dependent dynamics, render them a promising technology for all-memristive neuro-synaptic implementations. Combined with the high speed, low energy, and excellent scalability these devices have already demonstrated in emerging memory applications, an all-memristive computational primitive offers all the key features for application in large-scale neuromorphic systems. The experimental studies presented in this work demonstrate that, despite the simplified neuro-synaptic implementation and the inherent variability of the phase-change cells, the memristive components provide the required neuromorphic functionality. However, open issues related to the interconnectivity and integration of the memristive components in a neuromorphic processor chip remain to be addressed. Further work should also address the algorithmic implications of the variability, stochasticity, and storage resolution of memristive neurons and synapses in more complex neural network configurations.

In biology, architectures with highly variable components can adapt and perform specialized operations in a robust way. Key features of biological systems are the best source of inspiration for building compact and efficient artificial neural systems for computing applications, such as big data analytics and sensory information processing. In this paper, inspired by specialized coders of low sound levels in the auditory cortex, they incorporated the level-tuned neuron approach for detecting multiple correlated input patterns. To implement level-tuned neurons, state information already present in the neuronal structure was exploited. This information provided powerful insights into the characteristics of the input data streams, which proved instrumental for the experimental demonstration of multiple pattern detection. Therefore, they believe that this development could function as the basis for further research in enhanced large-scale neural network configurations.
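Under a simplified reading of this scheme, level-tuning can be sketched as a gating rule: each neuron in a layer integrates input only while a shared state variable falls inside the band that neuron is tuned to, so different neurons specialize to different activity levels. In the paper the gating exploits the internal state of a neighboring neuron; the sketch below gates on the instantaneous input level instead, and the band edges, threshold, and input stream are invented for illustration.

```python
# Hedged sketch of level-tuned neurons: each neuron integrates input only
# while the instantaneous input level sits inside its designated band.
# Bands, threshold, and the gating rule are illustrative assumptions;
# see the paper for the actual interconnection scheme.

class LevelTunedNeuron:
    def __init__(self, low, high, threshold=3.0):
        self.low, self.high = low, high     # activity band this neuron is tuned to
        self.threshold = threshold
        self.potential = 0.0

    def step(self, level):
        """Integrate only when the input level falls in this neuron's band."""
        if self.low <= level < self.high:
            self.potential += level
            if self.potential >= self.threshold:
                self.potential = 0.0        # fire and reset
                return True
        return False

# Two neurons tuned to different input levels: the first to weak activity,
# the second to strong activity.
neurons = [LevelTunedNeuron(1, 3), LevelTunedNeuron(3, 6)]
stream = [0, 2, 2, 5, 1, 4, 5, 2, 0, 5, 2, 5]
for t, level in enumerate(stream):
    for k, n in enumerate(neurons):
        if n.step(level):
            print(f"t={t}: neuron {k} (band [{n.low},{n.high})) fired")
```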

Abstract

In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.
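For readers unfamiliar with the plasticity rule named in the abstract, here is a minimal pair-based STDP update in its generic textbook form (the paper's device-level implementation differs): a synapse is potentiated when its presynaptic spike precedes the postsynaptic spike and depressed otherwise, with an exponential dependence on the timing gap. The constants are illustrative.

```python
import math

# Minimal pair-based STDP rule (generic textbook form; the paper's
# device-level implementation differs). dt_ms = t_post - t_pre.

A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0   # illustrative constants

def stdp_delta_w(dt_ms):
    """Weight change for one pre/post spike pair."""
    if dt_ms > 0:    # pre before post: potentiate
        return A_PLUS * math.exp(-dt_ms / TAU)
    else:            # post before (or with) pre: depress
        return -A_MINUS * math.exp(dt_ms / TAU)

for dt in (-40, -10, -1, 1, 10, 40):
    print(f"dt={dt:+4d} ms -> dw={stdp_delta_w(dt):+.4f}")
```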

SOURCES: IBM, YouTube, Nanotechnology