Spike-train encoding and threshold-rescaling method for deep spiking neural networks

Deep neural networks such as convolutional neural networks (CNNs) have achieved great success in a broad range of fields. Spiking neural networks (SNNs) are designed as a solution for realizing ultra-low power consumption on spike-based neuromorphic hardware. To enable the mapping between conventional artificial neural networks (ANNs) and spike-based neuromorphic hardware, direct conversion from conventional ANNs into SNNs has recently been proposed. However, performance loss after conversion is hard to avoid. To reduce this loss, we analyze the encoding methods of SNNs and the optimization methods for conversion. We adopt rate coding and approximate the output of a CNN activation function by the number of spikes a neuron produces within a given time window in the SNN. We propose a method for generating fixed, uniform spike trains that contain exactly the expected number of spikes, and present an optimization method that reduces the conversion loss by rescaling the threshold of each layer in the SNN. For evaluation, we train three different CNNs on the MNIST and SVHN datasets separately and convert all of them into SNNs. The networks achieve a maximum accuracy of 99.17% on MNIST and 93.36% on SVHN with its additional training set. The results show that the proposed fixed, uniform spike trains not only outperform Poisson-distributed spike trains but also take less time to generate. Our threshold-rescaling method greatly improves the performance of the converted SNNs.
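The contrast between the proposed deterministic encoding and conventional Poisson encoding can be sketched as follows. This is a minimal illustration in plain Python, assuming a discrete-time binary simulation; the function names and the evenly spaced placement rule are illustrative assumptions, not the authors' exact implementation:

```python
import random

def uniform_spike_train(n_spikes, n_steps):
    """Deterministic encoding: place exactly `n_spikes` spikes,
    evenly spaced, in a window of `n_steps` time steps.
    Requires n_spikes <= n_steps."""
    train = [0] * n_steps
    if n_spikes == 0:
        return train
    step = n_steps / n_spikes  # spacing >= 1, so all indices are distinct
    for i in range(n_spikes):
        train[int(i * step)] = 1
    return train

def poisson_spike_train(n_spikes, n_steps):
    """Stochastic encoding: fire each step with probability
    n_spikes / n_steps. The spike count only matches the
    target in expectation, not exactly."""
    p = n_spikes / n_steps
    return [1 if random.random() < p else 0 for _ in range(n_steps)]

# The uniform train carries the rate exactly; the Poisson train
# has binomial variance around it, which adds approximation noise
# when spike counts stand in for ANN activations.
exact = sum(uniform_spike_train(25, 100))      # always 25
noisy = sum(poisson_spike_train(25, 100))      # 25 only on average
```

The deterministic generator also needs no random-number draws, which is consistent with the abstract's claim that the fixed, uniform trains are cheaper to generate than Poisson-distributed ones.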
