Optimized spiking neurons can classify images with high accuracy through temporal coding with two spikes

Spike-based neuromorphic hardware is a promising option for reducing the energy consumption of image classification and other deep learning applications. A drastic reduction of this energy consumption is especially needed for deploying state-of-the-art deep learning results on mobile phones and other edge devices. However, direct training of deep spiking neural networks is difficult, and previous methods for converting trained artificial neural networks into spiking networks were inefficient because the neurons had to emit too many spikes. We show that a substantially more efficient conversion arises when the spiking neuron model is optimized for that purpose, so that what matters for information transmission is not only how many spikes a neuron emits, but also when it emits them. This advances the accuracy that can be achieved for image classification with spiking neurons, and the resulting networks need on average just two spikes per neuron to classify an image. In addition, our new conversion method drastically improves the latency and throughput of the resulting spiking networks.
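
To make the idea of temporal coding more concrete, the following minimal Python sketch illustrates one way a neuron's output can depend on when it spikes, not just on how many spikes it emits. It assumes, purely for illustration, a scheme in which a spike at time step t contributes a weight proportional to 2^(-t), so that a handful of precisely timed spikes can approximate a continuous activation value such as a ReLU output; the neuron model and parameters actually optimized in the paper may differ.

```python
import numpy as np

def encode_few_spikes(value, n_steps=8):
    """Greedily encode a non-negative activation value as a short spike train.

    Illustrative assumption (not necessarily the paper's optimized scheme):
    a spike at time step t contributes 2**-t to the decoded value, so spike
    timing carries most of the information and only a few spikes are needed.
    """
    spikes = np.zeros(n_steps, dtype=int)
    residual = value
    for t in range(n_steps):
        weight = 2.0 ** -t          # time-dependent contribution of a spike
        if residual >= weight:      # threshold crossing -> emit a spike
            spikes[t] = 1
            residual -= weight      # reset-by-subtraction of that contribution
    return spikes

def decode_few_spikes(spikes):
    """Decode the spike train back into an approximate activation value."""
    weights = 2.0 ** -np.arange(len(spikes))
    return float(spikes @ weights)

if __name__ == "__main__":
    for x in [0.0, 0.3, 0.75, 1.6]:
        s = encode_few_spikes(x)
        print(f"value={x:4.2f}  spikes={s}  "
              f"decoded={decode_few_spikes(s):.3f}  n_spikes={s.sum()}")
```

Under this timing-based weighting, eight time steps can distinguish 256 activation levels while typically requiring only a few spikes per neuron, which is the kind of sparsity that makes the converted spiking networks energy-efficient.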
