Toward One-Shot Learning in Neuroscience-Inspired Deep Spiking Neural Networks

Conventional deep neural networks capture essential information-processing stages in perception, but they typically require very large numbers of training examples, whereas children can learn concepts such as handwritten digits from only a few examples. The goal of this project is to develop a deep spiking neural network that can learn from a few training trials. Using known neuronal mechanisms, a spiking neural network model is developed and trained to recognize handwritten digits after being presented with only one to four training examples per digit taken from the MNIST database. The model detects and learns geometric features of the MNIST images. A novel, biologically motivated back-propagation-based learning rule is developed and used to train the network to detect basic features of the different digits; for this purpose, randomly initialized synaptic weights between the layers are updated during training. Using a neuroscience-inspired mechanism known as 'synaptic pruning' together with a predefined threshold, some synapses are deleted over the course of training. In this way, digit-specific 'information channels' are constructed as matrices of synaptic connections between two layers of the spiking neural network. These connection matrices are then used in the test phase to assign a digit class to each test image. Mirroring humans' ability to learn from a small number of training trials, the developed spiking neural network requires a very small training dataset compared to conventional deep learning methods evaluated on MNIST.
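To make the described pipeline concrete, the sketch below is a minimal, hypothetical rendering (not the authors' code) of the steps the abstract outlines: random weight initialization, a local update driven by one to four presentations per digit, threshold-based synaptic pruning, and test-phase classification by matching each image against the surviving per-digit connection matrices ('information channels'). Layer sizes, the Hebbian-style stand-in for the biological back-propagation rule, the binarization of pixel activity into spikes, the threshold value, and the overlap-based score are all assumptions.

```python
# Hypothetical sketch of the training/pruning/classification pipeline.
# All names and parameter values below are illustrative assumptions.

import numpy as np

RNG = np.random.default_rng(0)
N_INPUT = 28 * 28        # MNIST pixels treated as input "neurons"
N_HIDDEN = 100           # size of the second spiking layer (assumed)
PRUNE_THRESHOLD = 0.5    # predefined pruning threshold (assumed value)


def spike_pattern(image):
    """Binary spike pattern: a pixel 'fires' if its intensity exceeds a cutoff."""
    return (image.reshape(-1) > 0.5).astype(float)


def train_channel(examples, epochs=5, lr=0.1):
    """Build one digit-specific 'information channel' from 1-4 example images."""
    weights = RNG.random((N_INPUT, N_HIDDEN))  # random initialization
    for _ in range(epochs):
        for image in examples:
            pre = spike_pattern(image)                               # presynaptic activity
            post = (pre @ weights > 0.1 * pre.sum()).astype(float)   # crude postsynaptic spikes
            # Hebbian-style strengthening of co-active synapses, used here
            # as a stand-in for the paper's biological back-propagation rule.
            weights += lr * np.outer(pre, post)
    weights /= weights.max()
    # Synaptic pruning: delete synapses below the predefined threshold and
    # keep the surviving binary connection matrix as the channel.
    return (weights > PRUNE_THRESHOLD).astype(float)


def classify(image, channels):
    """Assign the digit whose channel best overlaps the image's spike pattern."""
    pre = spike_pattern(image)
    scores = {digit: (pre @ channel).sum() / (channel.sum() + 1e-9)
              for digit, channel in channels.items()}
    return max(scores, key=scores.get)


# Usage with real MNIST data (images scaled to [0, 1]):
# channels = {d: train_channel(train_images_for_digit[d][:4]) for d in range(10)}
# prediction = classify(test_image, channels)
```

The key design point the sketch tries to reflect is that classification does not rely on learned continuous weights at test time: only the pruned, binary connection matrices survive, and a test image is assigned to whichever channel its spike pattern activates most strongly.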
