Relaxing the Constraints on Predictive Coding Models

Predictive coding is an influential theory of cortical function which posits that the principal computation the brain performs, underlying both perception and learning, is the minimization of prediction errors. Although motivated by high-level notions of variational inference, detailed neurophysiological models of cortical microcircuits that can implement its computations have been developed. Moreover, under certain conditions, predictive coding has been shown to approximate the backpropagation of error algorithm, and thus provides a relatively biologically plausible credit-assignment mechanism for training deep networks. However, standard implementations of the algorithm still involve potentially neurally implausible features such as identical forward and backward weights, backward nonlinear derivatives, and one-to-one error-unit connectivity. In this paper, we show that these features are not integral to the algorithm and can be removed, either directly or by learning additional sets of parameters with Hebbian update rules, without noticeable harm to learning performance. Our work thus relaxes current constraints on potential microcircuit designs and hopefully opens up new regions of the design space for neuromorphic implementations of predictive coding.
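To make the algorithm concrete, the following is a minimal sketch of a standard predictive coding network of the kind the abstract refers to: activities relax to minimize layer-wise prediction errors, and weights are updated with a local Hebbian rule (postsynaptic error times presynaptic activity). The layer sizes, learning rates, use of `tanh`, and NumPy implementation are our illustrative assumptions, not the paper's actual model; note that the weight transpose `W[l].T` in the inference dynamics is exactly the symmetric-weight feature the paper shows can be relaxed.

```python
import numpy as np

rng = np.random.default_rng(0)
f, df = np.tanh, lambda v: 1.0 - np.tanh(v) ** 2   # activation and its derivative

sizes = [4, 8, 3]                                   # arbitrary layer widths
W = [rng.normal(0.0, 0.1, (sizes[l + 1], sizes[l])) for l in range(2)]

def step(x, y, n_infer=100, lr_mu=0.1, lr_w=0.01):
    """One training step on a single (x, y) pair."""
    # Initialize activities with a forward sweep, then clamp both ends.
    mu = [x.copy()]
    for Wl in W:
        mu.append(Wl @ f(mu[-1]))
    mu[-1] = y.copy()
    # Inference phase: relax hidden activities to reduce prediction errors.
    for _ in range(n_infer):
        eps = [mu[l + 1] - W[l] @ f(mu[l]) for l in range(2)]
        for l in range(1, len(mu) - 1):             # hidden layers only
            mu[l] += lr_mu * (-eps[l - 1] + df(mu[l]) * (W[l].T @ eps[l]))
    # Hebbian weight update: postsynaptic error times presynaptic activity.
    eps = [mu[l + 1] - W[l] @ f(mu[l]) for l in range(2)]
    for l in range(2):
        W[l] += lr_w * np.outer(eps[l], f(mu[l]))
    return float(sum((e ** 2).sum() for e in eps))  # total squared prediction error

x, y = rng.normal(size=4), rng.normal(size=3)
losses = [step(x, y) for _ in range(200)]
assert losses[-1] < losses[0]                        # prediction error decreases
```

Both the inference dynamics and the weight update here descend the same prediction-error energy, which is why repeated steps on a clamped input/output pair drive the total error down.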
