A Pulse-gated, Neural Implementation of the Backpropagation Algorithm

It has long been thought that the backpropagation of errors could not be implemented in physiologically realistic neural circuits. This belief rests largely on, on the one hand, (1) the need for the feedback path to symmetrically replicate the feedforward weights, (2) the need for different activation functions during the forward and backward sweeps, and (3) the need for a separate network to compute and store error gradients, and, on the other hand, (4) the nonphysiological alternative of backpropagating errors through the forward-propagating neurons themselves. In this paper, we present spiking-neuron mechanisms that gate pulses to maintain short-term memories, control forward inference and backward error propagation, and coordinate the learning of feedforward and feedback weights. These neural mechanisms are synthesized into a new backpropagation algorithm for neuromorphic circuits.
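The computation being coordinated here is standard backpropagation, with the added constraint that each sweep only proceeds when its gating pulse arrives and layer activities must be held in short-term memory between sweeps. The following is a toy, rate-based sketch of that schedule, not the paper's spiking implementation: the pulse sequence is indicated by comments, the memory buffers are ordinary variables, and all names, sizes, and the learning rate are illustrative.

```python
import numpy as np

# Toy rate-based caricature of pulse-gated backpropagation.
# Each commented "pulse" marks a stage that, in a pulse-gated circuit,
# would only run while its gating pulse is active; between pulses the
# layer activities (x, h, out, d_out, d_h) are held as short-term memories.
rng = np.random.default_rng(0)

def sigma(z):
    """Logistic activation."""
    return 1.0 / (1.0 + np.exp(-z))

# A small 2-3-1 network trained on XOR as a sanity check.
W1 = rng.normal(0.0, 1.0, (3, 2))   # feedforward weights, layer 1
W2 = rng.normal(0.0, 1.0, (1, 3))   # feedforward weights, layer 2

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def mse_loss():
    preds = sigma(W2 @ sigma(W1 @ X.T)).T
    return float(np.mean((preds - Y) ** 2))

loss_before = mse_loss()

eta = 2.0  # illustrative learning rate
for epoch in range(5000):
    for x, y in zip(X, Y):
        # --- forward sweep ---
        h = sigma(W1 @ x)        # pulse 1: hidden layer reads gated input
        out = sigma(W2 @ h)      # pulse 2: output layer reads hidden memory
        # --- backward sweep (forward memories still held) ---
        d_out = (out - y) * out * (1 - out)   # pulse 3: output-layer error
        d_h = (W2.T @ d_out) * h * (1 - h)    # pulse 4: error fed back to hidden layer
        # --- weight updates use the remembered pre- and postsynaptic activities ---
        W2 -= eta * np.outer(d_out, h)
        W1 -= eta * np.outer(d_h, x)

loss_after = mse_loss()
print(loss_before, loss_after)
```

Note that stage (4) above uses the transpose `W2.T` in the backward sweep, i.e. the symmetric feedback weights of point (1) in the abstract; the paper's contribution is a mechanism for coordinating these sweeps and learning the feedback weights with pulse gating rather than assuming this transpose for free.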
