Conversion of Synchronous Artificial Neural Network to Asynchronous Spiking Neural Network using sigma-delta quantization

Artificial Neural Networks (ANNs) achieve strong performance on many data analysis tasks, including visual and auditory applications. However, direct implementations of these algorithms that ignore the sparsity of the data require high processing power, consume vast amounts of energy, and suffer from scalability issues. One biologically inspired method that can reduce power consumption and enable scalable implementations of neural networks is asynchronous processing and communication by means of action potentials, so-called spikes. In this work, we use the well-known sigma-delta quantization method and introduce a simple, straightforward way to convert an Artificial Neural Network into a Spiking Neural Network that can be implemented asynchronously on a neuromorphic platform. Briefly, we use asynchronous spikes to communicate the quantized output activations of the neurons. Although the proposed mechanism is simple and applicable to a wide range of ANNs, it outperforms state-of-the-art implementations in terms of accuracy and energy consumption. All source code for this project is available upon request for academic purposes.
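The abstract only summarizes the conversion idea, so the following is a minimal sketch of how sigma-delta quantization can be applied to one ANN layer: each neuron quantizes its activation and transmits only the signed change since its last update, which downstream neurons accumulate. The class name SigmaDeltaLayer, the fixed quantization step, and the dense NumPy tensors standing in for asynchronous event routing are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class SigmaDeltaLayer:
    """Sketch of a sigma-delta quantized layer for ANN-to-SNN conversion.

    Assumption: a ReLU network with trained weights. Each neuron remembers the
    last quantized activation it sent and, on every update, emits only the
    signed difference (the "delta"); receivers integrate these deltas, so
    unchanged activations cost no communication.
    """

    def __init__(self, weights, bias, step=0.1):
        self.W = weights            # (out, in) weight matrix from the trained ANN
        self.b = bias               # (out,) bias vector
        self.step = step            # quantization step of the sigma-delta encoder
        self.membrane = np.zeros(weights.shape[0])   # accumulated input deltas
        self.last_sent = np.zeros(weights.shape[0])  # last quantized output per neuron

    def forward(self, input_delta):
        """Consume the change in the previous layer's quantized output and
        emit this layer's change in quantized output."""
        # Integrating the deltas reconstructs the current quantized input;
        # zero entries in input_delta contribute nothing.
        self.membrane += self.W @ input_delta
        act = np.maximum(self.membrane + self.b, 0.0)       # ReLU on reconstructed pre-activation
        quantized = self.step * np.round(act / self.step)   # sigma-delta quantization
        out_delta = quantized - self.last_sent               # spikes encode only the change
        self.last_sent = quantized
        return out_delta   # zero entries mean "no spike": nothing is communicated
```

Chained layers of this kind only process nonzero deltas, so a static input produces activity on the first update and silence afterwards; this is one way to realize the event-driven sparsity that the abstract attributes to spike-based communication of quantized activations.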
