CSNN: An Augmented Spiking based Framework with Perceptron-Inception

Spiking Neural Networks (SNNs) represent and transmit information as spikes, which is considered more biologically realistic and computationally powerful than traditional Artificial Neural Networks. Spiking neurons encode useful temporal information and exhibit strong robustness to noise. However, the feature extraction ability of typical SNNs is limited by their shallow structures. This paper focuses on improving the feature extraction ability of SNNs by leveraging the powerful feature extraction of Convolutional Neural Networks (CNNs), which extract abstract features through their convolutional feature maps. We propose a CNN-SNN (CSNN) model that combines the feature learning ability of CNNs with the cognition ability of SNNs. The CSNN model learns encoded spatiotemporal representations of images in an event-driven way. We evaluate the CSNN model on MNIST and its variants, examining its learning capabilities, encoding mechanisms, robustness to noisy stimuli, and classification performance. The results show that CSNN performs well compared with other cognitive models while using significantly fewer neurons and training samples. Our work brings more biological realism into modern image classification models, in the hope that these models can inform how the brain performs this high-level vision task.
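The pipeline the abstract describes (convolutional feature extraction, temporal spike encoding, and an event-driven spiking readout) can be illustrated with a minimal sketch. This is not the paper's implementation: the convolution, the time-to-first-spike encoder, and the leaky integrate-and-fire (LIF) readout below are generic textbook components, and all function names and parameter values (`t_max`, `tau`, `threshold`) are illustrative assumptions.

```python
import numpy as np

def conv2d_valid(img, kernel):
    """Naive valid-mode 2D convolution (CNN-style feature extraction)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def latency_encode(features, t_max=100.0):
    """Time-to-first-spike coding: stronger activations fire earlier."""
    f = np.maximum(features, 0.0)        # rectify negative responses
    f = f / (f.max() + 1e-12)            # normalise to [0, 1]
    return t_max * (1.0 - f)             # activation 1.0 -> spike at t = 0

def lif_response(spike_times, weights, threshold=1.0, tau=20.0,
                 t_max=100.0, dt=1.0):
    """Event-driven LIF readout: integrates weighted input spikes and
    returns the first firing time, or None if the neuron stays silent."""
    v = 0.0
    for t in np.arange(0.0, t_max, dt):
        v *= np.exp(-dt / tau)           # membrane leak
        # add the weights of all input spikes arriving in this time step
        v += weights[(spike_times >= t) & (spike_times < t + dt)].sum()
        if v >= threshold:
            return t
    return None

# Toy usage: an image with one bright row, averaged by a 3x3 kernel,
# encoded into spike latencies and fed to a single LIF output neuron.
img = np.zeros((5, 5))
img[2, :] = 1.0
feats = conv2d_valid(img, np.ones((3, 3)) / 9.0)
spikes = latency_encode(feats).ravel()
fire_t = lif_response(spikes, np.ones(spikes.size), threshold=0.5)
```

The key property to notice is the temporal code: the encoder maps activation magnitude onto spike timing, so downstream spiking neurons can discriminate inputs by when spikes arrive rather than only by how many arrive.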
