Towards Scalable, Efficient and Accurate Deep Spiking Neural Networks with Backward Residual Connections, Stochastic Softmax and Hybridization

Spiking Neural Networks (SNNs) may offer an energy-efficient alternative for implementing deep learning applications. In recent years, there have been several proposals focused on supervised (conversion, spike-based gradient descent) and unsupervised (spike-timing-dependent plasticity) training methods to improve the accuracy of SNNs on large-scale tasks. However, each of these methods suffers from scalability, latency, and accuracy limitations. In this paper, we propose novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations to improve the learning ability of these training methodologies, yielding competitive accuracy along with large efficiency gains over their artificial counterparts. Here, artificial counterparts refer to conventional deep learning/artificial neural networks (ANNs). Our techniques apply to VGG and residual architectures and are compatible with all of the above training methodologies. Our analysis reveals that the proposed solutions yield near state-of-the-art accuracy with significant energy efficiency and reduced parameter overhead, translating to hardware improvements on complex visual recognition tasks such as the CIFAR-10 and ImageNet datasets.
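As an illustrative sketch only (the abstract does not spell out the exact formulation, so every name, threshold, and design choice below is an assumption rather than the authors' implementation), a hybrid artificial-and-spiking activation can be approximated by a leaky integrate-and-fire (LIF) unit that is swapped for a standard ReLU on a per-layer basis:

import torch
import torch.nn as nn

class LIFActivation(nn.Module):
    """Leaky integrate-and-fire activation: accumulates a membrane potential
    over time steps and emits a binary spike when it crosses the threshold."""
    def __init__(self, v_th=1.0, leak=0.9):
        super().__init__()
        self.v_th, self.leak = v_th, leak

    def forward(self, x_seq):                    # x_seq: (T, batch, features)
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x_t in x_seq:
            v = self.leak * v + x_t              # leaky integration
            s = (v >= self.v_th).float()         # fire if threshold crossed
            v = v - s * self.v_th                # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes)

class HybridLayer(nn.Module):
    """Linear layer whose activation is either spiking (LIF) or artificial
    (ReLU), chosen per layer; hypothetical names, not the paper's code."""
    def __init__(self, in_features, out_features, spiking=True):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.act = LIFActivation() if spiking else nn.ReLU()

    def forward(self, x_seq):                    # apply the layer per time step
        return self.act(torch.stack([self.fc(x_t) for x_t in x_seq]))

if __name__ == "__main__":
    T, batch, feats = 8, 4, 16
    x_seq = torch.rand(T, batch, feats)          # rate-coded random input
    layer = HybridLayer(feats, 32, spiking=True)
    print(layer(x_seq).shape)                    # torch.Size([8, 4, 32])

Setting spiking=False in selected layers gives one plausible reading of the hybrid ANN/SNN configuration described in the abstract; the actual placement of spiking versus artificial layers is detailed in the paper itself.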
