T2FSNN: Deep Spiking Neural Networks with Time-to-first-spike Coding

Spiking neural networks (SNNs) have attracted considerable interest due to their energy-efficient characteristics, yet the lack of a scalable training algorithm has restricted their applicability to practical machine learning problems. The deep neural network-to-SNN conversion approach has been widely studied to broaden the applicability of SNNs. Most previous studies, however, have not fully utilized the spatio-temporal aspects of SNNs, leading to inefficiency in terms of the number of spikes and the inference latency. In this paper, we present T2FSNN, which introduces the concept of time-to-first-spike coding into deep SNNs using a kernel-based dynamic threshold and dendrite to overcome the aforementioned drawback. In addition, we propose gradient-based optimization and early firing methods to further increase the efficiency of the T2FSNN. According to our results, the proposed methods can reduce the inference latency and the number of spikes to 22% and less than 1%, respectively, of those of burst coding, which is the state-of-the-art result on CIFAR-100.
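The key idea behind T2FSNN, time-to-first-spike (TTFS) coding, represents an activation by when a neuron fires rather than by how often it fires, so each neuron emits at most one spike. The Python sketch below illustrates only this encoding principle under a simple linear intensity-to-time mapping; the function names (ttfs_encode, ttfs_decode), the time window t_max, and the linear mapping are illustrative assumptions, not the paper's kernel-based dynamic threshold scheme.

```python
import numpy as np

def ttfs_encode(intensities, t_max=100):
    """Toy time-to-first-spike encoding (assumed linear mapping, not the paper's scheme):
    a larger input value produces an earlier spike; each neuron fires at most once.
    A zero input never fires and is represented by the sentinel time t_max."""
    intensities = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    spike_times = np.where(intensities > 0,
                           np.round((1.0 - intensities) * (t_max - 1)),
                           t_max)
    return spike_times.astype(int)

def ttfs_decode(spike_times, t_max=100):
    """Recover an approximate intensity from the first-spike time."""
    spike_times = np.asarray(spike_times, dtype=float)
    return np.where(spike_times < t_max, 1.0 - spike_times / (t_max - 1), 0.0)

if __name__ == "__main__":
    pixels = [0.9, 0.5, 0.1, 0.0]        # example input intensities
    times = ttfs_encode(pixels)          # stronger inputs spike earlier
    print("spike times:", times)         # -> [ 10  50  89 100] (100 = no spike)
    print("decoded    :", ttfs_decode(times))
```

In this toy mapping, stronger inputs spike earlier and silent inputs never spike, which conveys why a TTFS-coded network can carry information with far fewer spikes than rate- or burst-coded alternatives.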
