Synaptic noise as a means of implementing weight-perturbation learning

Weight-perturbation (WP) algorithms for supervised and reinforcement learning offer improved biological plausibility over backpropagation because they require far less circuitry to realize in neural hardware. This paper explores the hypothesis that biological synaptic noise might serve as the substrate by which weight perturbation is implemented. We focus on the basic synaptic noise hypothesis (BSNH), which embodies the weakest assumptions about the underlying neural circuitry required to implement WP algorithms. The paper identifies relevant biological constraints consistent with the BSNH, classifies existing WP algorithms by their consistency with those constraints, and proposes a new WP algorithm that is fully consistent with them. Simulation studies comparing the learning effectiveness of these algorithms show that all of them can support traditional neural network learning tasks and exhibit similar generalization characteristics, although the results suggest a trade-off between learning efficiency and biological fidelity.
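
To make the family of algorithms under discussion concrete, the following Python sketch implements a classic weight-perturbation update of the kind the BSNH would have synaptic noise supply. It is an illustrative assumption, not the paper's specific BSNH-consistent rule: the function names, the Gaussian noise model, and the single-layer network are all hypothetical choices made for the example.

    import numpy as np

    def loss(W, X, Y):
        """Mean squared error of a one-layer sigmoid network."""
        pred = 1.0 / (1.0 + np.exp(-X @ W))
        return np.mean((pred - Y) ** 2)

    def wp_step(W, X, Y, sigma=1e-3, lr=0.1, rng=np.random):
        """One weight-perturbation update.

        Perturb every weight with small Gaussian noise (the synaptic-noise
        analogue under the BSNH), measure the resulting scalar change in
        loss, and move each weight against its share of that change:
        (dE / sigma**2) * dW is, to first order, an unbiased estimate of
        the gradient of E with respect to W.
        """
        base = loss(W, X, Y)
        dW = sigma * rng.standard_normal(W.shape)  # per-weight noise
        dE = loss(W + dW, X, Y) - base             # one global error signal
        return W - lr * (dE / sigma ** 2) * dW

The point of the sketch is the locality of the circuitry: each synapse needs only its own noise sample dW and a globally broadcast scalar dE, with no per-weight error transport of the kind backpropagation requires.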
