A Plasticity-Centric Approach to Train the Non-Differential Spiking Neural Networks

Many efforts have been made to train spiking neural networks (SNNs), but most approaches still struggle with the discontinuous and non-differentiable characteristics of SNNs. Mammalian brains, in contrast, solve such problems by integrating a series of biological plasticity learning rules. In this paper, we focus on two biologically plausible methodologies to address these training problems in SNNs. First, a biological neural network maintains a balance between inputs and outputs at both the neuron and the network level. Second, biological synaptic weights are updated passively according to changes in the membrane potentials of neighboring neurons, and synaptic plasticity does not propagate back to earlier layers. Inspired by these observations, we propose the Voltage-driven Plasticity-centric SNN (VPSNN), which comprises four steps: feedforward inference, unsupervised equilibrium-state learning, supervised last-layer learning, and passive updating of synaptic weights based on spike-timing-dependent plasticity (STDP). VPSNN achieves an accuracy of 98.52% on the MNIST handwritten-digit classification task. In addition, with the help of a visualization tool, we analyze the internal representations of the SNN to better understand the benefits provided by the proposed method.
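The abstract outlines a four-step training procedure but gives no implementation details. The following is a minimal illustrative sketch, not the authors' code: it uses NumPy, rate-like surrogate "membrane potentials", and hypothetical layer sizes, learning rate, and update rules to show how the four steps (inference, equilibrium relaxation, last-layer supervision, and a voltage-change-driven local weight update) could fit together without backpropagation.

    # Minimal sketch (assumptions throughout, not the paper's implementation).
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 784-100-10 network; values below are illustrative only.
    W1 = rng.normal(0.0, 0.1, (784, 100))
    W2 = rng.normal(0.0, 0.1, (100, 10))
    LEARNING_RATE = 0.01  # assumed value, not taken from the paper


    def feedforward_inference(x, w1, w2):
        """Step 1: propagate the input and record surrogate membrane potentials."""
        v_hidden = np.tanh(x @ w1)
        v_output = np.tanh(v_hidden @ w2)
        return v_hidden, v_output


    def equilibrium_relaxation(x, w1, w2, steps=20):
        """Step 2: unsupervised relaxation toward an input/output balance
        (a stand-in for the paper's equilibrium-state learning)."""
        v_hidden, v_output = feedforward_inference(x, w1, w2)
        for _ in range(steps):
            # Nudge hidden potentials toward agreement with both neighbors.
            v_hidden = 0.5 * (np.tanh(x @ w1) + np.tanh(v_output @ w2.T))
            v_output = np.tanh(v_hidden @ w2)
        return v_hidden, v_output


    def supervised_last_layer(v_output, target):
        """Step 3: supervision is applied only at the last layer; no error backprop."""
        return v_output + LEARNING_RATE * (target - v_output)


    def voltage_driven_update(w, v_pre_before, v_pre_after, v_post_before, v_post_after):
        """Step 4: a weight changes passively with the *changes* in the membrane
        potentials of the two neurons it connects; nothing propagates further back."""
        dv_pre = v_pre_after - v_pre_before
        dv_post = v_post_after - v_post_before
        return w + LEARNING_RATE * np.outer(dv_pre, dv_post)


    # One illustrative training step on a random "digit".
    x = rng.random(784)
    target = np.eye(10)[3]

    v_h0, v_o0 = feedforward_inference(x, W1, W2)          # step 1
    v_h1, v_o1 = equilibrium_relaxation(x, W1, W2)         # step 2
    v_o2 = supervised_last_layer(v_o1, target)             # step 3
    W1 = voltage_driven_update(W1, x, x, v_h0, v_h1)       # step 4 (input fixed)
    W2 = voltage_driven_update(W2, v_h0, v_h1, v_o0, v_o2)

The key design point illustrated here is locality: each weight update reads only the potential changes of its own pre- and post-synaptic neurons, so no gradient or error signal has to cross the non-differentiable spiking nonlinearity.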
