Example Based Hebbian Learning may be sufficient to support Human Intelligence

In this hypothesis paper we argue that, when driven by example behavior, a simple Hebbian learning mechanism can form the core of a computational theory of learning that supports both low-level learning and the development of human-level intelligence. We show that, when driven by example behavior, Hebbian learning rules can support semantic, episodic, and procedural memory. For humans, we hypothesize that the abilities to manipulate an offline world model and to abstract using language allow for the generation and communication of rich example behavior, thereby supporting human learning and a gradual increase of collective human intelligence across generations. We also compare the properties of Example Based Hebbian (EBH) learning with those of backpropagation-based learning and argue that the EBH mechanism is more consistent with the observed characteristics of human learning.
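
To make the core mechanism concrete, the sketch below shows a minimal, rate-based Hebbian weight update driven by example behavior: each example pairs a stimulus with a demonstrated response, and the demonstrated response directly supplies the postsynaptic activity, so weights between co-active units are strengthened without any backpropagated error signal. This is an illustrative assumption of how EBH learning could be simulated, not the paper's implementation; the `hebbian_update` function, the toy stimuli, and the normalization step are all hypothetical.

```python
import numpy as np

def hebbian_update(W, x, y, lr=0.01):
    """One Hebbian step: weights between co-active pre- and postsynaptic units grow."""
    return W + lr * np.outer(y, x)

rng = np.random.default_rng(0)
n_in, n_out = 16, 4

# Hypothetical "example behavior": one stimulus prototype per response unit.
prototypes = rng.random((n_out, n_in))
W = np.zeros((n_out, n_in))

for _ in range(500):
    k = rng.integers(n_out)                                 # pick a demonstrated behavior
    x = prototypes[k] + 0.1 * rng.standard_normal(n_in)     # noisy stimulus
    y = np.eye(n_out)[k]                                    # the example supplies the response pattern
    W = hebbian_update(W, x, y)
    W /= np.linalg.norm(W, axis=1, keepdims=True) + 1e-12   # crude normalization to bound weight growth

# After training, each prototype stimulus most strongly drives the unit whose
# behavior was demonstrated for it; this typically prints [0 1 2 3].
print(np.argmax(W @ prototypes.T, axis=0))
```

No error term or gradient appears in the update; the "supervision" enters only through the example-supplied activity pattern, which is the sense in which the learning is example based rather than error based.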
