Associative Memories via Predictive Coding

Associative memories in the brain receive and store patterns of activity registered by the sensory neurons, and are able to retrieve them when needed. Owing to their importance in human intelligence, computational models of associative memories have been developed for several decades. In this paper, we present a novel neural model for realizing associative memories, based on a hierarchical generative network that receives external stimuli via sensory neurons. The model is trained using predictive coding, an error-based learning algorithm inspired by information processing in the cortex. To test the model's capabilities, we perform multiple retrieval experiments on both corrupted and incomplete data points. In an extensive comparison, we show that this new model outperforms popular associative memory models, such as autoencoders trained via backpropagation and modern Hopfield networks, in both retrieval accuracy and robustness. In particular, when completing partial data points, our model achieves remarkable results on natural image datasets, such as ImageNet, retrieving images with surprisingly high accuracy even when only a tiny fraction of the pixels of the original images is presented. Our model also provides a plausible framework for studying learning and retrieval of memories in the brain, as it closely mimics the behavior of the hippocampus as a memory index and generative model.
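The mechanism described above can be illustrated with a minimal sketch: a single generative layer predicts the sensory pattern from latent activities, inference and learning both rely only on local prediction-error signals, and retrieval completes a partial pattern by clamping the observed neurons and letting the generative prediction fill in the rest. All function names, hyperparameters, and the shallow one-layer architecture here are illustrative assumptions for exposition, not the paper's actual (hierarchical, multi-layer) model.

```python
import numpy as np

# Illustrative sketch only: a one-layer predictive-coding associative memory.
# The paper's model is hierarchical; this toy keeps just the core idea of
# local error-driven inference, Hebbian-like learning, and completion by
# clamping observed sensory neurons.

rng = np.random.default_rng(0)
f = np.tanh                              # activation of latent neurons
df = lambda x: 1.0 - np.tanh(x) ** 2     # its derivative

def infer(s, W, mask=None, steps=200, lr=0.1):
    """Relax latent activities x to reduce prediction error on observed neurons."""
    if mask is None:
        mask = np.ones_like(s)
    x = np.zeros(W.shape[1])
    for _ in range(steps):
        e = mask * (s - W @ f(x))        # sensory prediction errors (observed only)
        x += lr * df(x) * (W.T @ e)      # local, error-driven latent update
    return x

def learn(patterns, W, epochs=100, lr_w=0.05):
    """Update generative weights with a local Hebbian-like rule."""
    for _ in range(epochs):
        for s in patterns:
            x = infer(s, W)
            e = s - W @ f(x)             # residual error after inference
            W += lr_w * np.outer(e, f(x))
    return W

def retrieve(s_partial, mask, W):
    """Complete a pattern: clamp observed entries, predict the missing ones."""
    x = infer(s_partial, W, mask=mask)
    return np.where(mask.astype(bool), s_partial, W @ f(x))

# Tiny demo: store three random binary patterns, then complete a half-masked one.
patterns = np.sign(rng.normal(size=(3, 64)))
W = rng.normal(scale=0.1, size=(64, 16))
err_before = np.linalg.norm(patterns[0] - W @ f(infer(patterns[0], W)))
W = learn(patterns, W)
err_after = np.linalg.norm(patterns[0] - W @ f(infer(patterns[0], W)))

mask = (np.arange(64) < 32).astype(float)  # observe only the first half
completed = retrieve(patterns[0] * mask, mask, W)
```

Note that, unlike backpropagation, every update above depends only on quantities available at the synapse itself (the local error and the presynaptic activity), which is the biological-plausibility argument the abstract alludes to.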
