Design of a Conventional-Transistor-Based Analog Integrated Circuit for On-Chip Learning in a Spiking Neural Network

The Spiking Neural Network (SNN) is of special interest among the various neural network (NN) algorithms. While SNNs have previously been trained with the biologically plausible Spike-Timing-Dependent Plasticity (STDP) rule on conventional computers and on specialized digital neuromorphic chips, analog hardware is better suited to such temporal-domain training. Further, complete intertwining of memory and computing can be achieved only with analog hardware, thereby removing the von Neumann bottleneck altogether. Hence, in this paper, we design an analog hardware SNN with neuron and synapse blocks based on conventional silicon transistors and capacitors. Our synapse block implements the STDP rule for weight updates. Our neuron block follows the biologically plausible Leaky Integrate-and-Fire (LIF) model, with mechanisms for spike generation and the homeostasis property. We then demonstrate on-chip learning on a popular Machine Learning (ML) data set (Fisher's Iris) through SPICE simulations of the complete hardware. Previous reports on analog hardware SNNs mostly involve emerging devices, which are much harder to fabricate than the conventional silicon transistors used here; those reports also do not show iterative learning through full circuit-level design and simulation of the SNN, unlike this paper. Moreover, other existing reports of analog SNNs that use conventional transistors do not show training on popularly used ML data sets, unlike this paper.
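For readers unfamiliar with the two mechanisms named above, the following is a minimal software sketch (not the authors' analog circuit) of the standard LIF membrane dynamics and the pairwise STDP weight update; all numerical parameter values are assumed for illustration only.

```python
import math

def lif_step(v, i_in, dt=1e-3, tau=20e-3, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """One Euler step of the Leaky Integrate-and-Fire (LIF) membrane equation
    tau * dv/dt = -(v - v_rest) + i_in.  Returns (new_v, spiked); on crossing
    the threshold v_th, the membrane potential resets to v_reset."""
    v = v + (dt / tau) * (-(v - v_rest) + i_in)
    if v >= v_th:
        return v_reset, True
    return v, False

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012,
            tau_plus=20e-3, tau_minus=20e-3):
    """Pairwise STDP weight change: potentiate when the presynaptic spike
    precedes the postsynaptic spike (t_post > t_pre), depress otherwise,
    with exponentially decaying magnitude in the spike-time difference."""
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

In the paper's hardware, the membrane potential corresponds to a capacitor voltage and the STDP exponentials arise from transistor-capacitor dynamics, rather than being computed explicitly as above.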
