Accelerating Spiking Neural Networks using Memristive Crossbar Arrays

Biologically inspired spiking neural networks (SNNs) hold great promise for performing demanding tasks in an energy- and area-efficient manner. Memristive devices organized in crossbar arrays can accelerate the operations of artificial neural networks (ANNs) while circumventing the limitations of traditional computing paradigms, and recent advances have led to neuromorphic accelerators that employ phase-change memory (PCM) devices. We propose an approach that fully exploits the potential of such systems for SNNs by integrating entire layers, comprising both synaptic weights and neuronal states, into crossbar arrays. However, the key challenge of such realizations stems from the intrinsic imperfections of PCM devices, which limit their effective precision. We therefore investigate the impact of these limitations on SNN performance and demonstrate that realizing both synaptic weights and neuronal states with 4-bit precision yields robust network performance. Moreover, we evaluate the scheme on a multi-layer SNN simulated with an experimentally verified model of the PCM devices and achieve performance comparable to that of a 32-bit floating-point model.
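To illustrate the idea of storing both synaptic weights and neuronal states at limited precision, the following is a minimal sketch (not the authors' implementation) of a leaky integrate-and-fire layer whose weights and membrane potentials are quantized to a small number of levels, loosely mimicking the restricted effective precision of PCM crossbar cells. All class names, constants, and the uniform quantizer are illustrative assumptions.

```python
import numpy as np

def quantize(x, n_bits=4, x_max=1.0):
    """Map values in [-x_max, x_max] onto 2**n_bits uniformly spaced levels (assumed quantizer)."""
    levels = 2 ** n_bits - 1
    x = np.clip(x, -x_max, x_max)
    return np.round((x + x_max) / (2 * x_max) * levels) / levels * (2 * x_max) - x_max

class QuantizedLIFLayer:
    """Leaky integrate-and-fire layer with low-precision weights and membrane state."""
    def __init__(self, n_in, n_out, n_bits=4, decay=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Synaptic weights stored at n_bits precision, as if programmed into a crossbar.
        self.w = quantize(rng.normal(0.0, 0.3, size=(n_in, n_out)), n_bits)
        self.v = np.zeros(n_out)   # membrane potential (neuronal state)
        self.n_bits = n_bits
        self.decay = decay
        self.threshold = threshold

    def step(self, spikes_in):
        # Crossbar-style weighted accumulation of the incoming spike vector.
        self.v = self.decay * self.v + spikes_in @ self.w
        # The neuronal state itself is also held at limited precision.
        self.v = quantize(self.v, self.n_bits, x_max=2.0)
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v[spikes_out == 1] = 0.0   # reset membrane potential after firing
        return spikes_out

# Example: drive the layer with a random binary spike pattern.
layer = QuantizedLIFLayer(n_in=784, n_out=128)
x = (np.random.default_rng(1).random(784) < 0.1).astype(float)
print("output spikes:", int(layer.step(x).sum()))
```

In this sketch the same 4-bit quantizer is applied to both the weight matrix and the membrane potential at every time step; the paper's actual device model and network architecture may differ.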
