A Growing Long-term Episodic & Semantic Memory

The long-term memory of most connectionist systems lies entirely in the weights of the system. Since the number of weights is typically fixed, this bounds the total amount of knowledge that can be learned and stored. Though this is not normally a problem for a neural network designed for a specific task, such a bound is undesirable for a system that continually learns over an open range of domains. To address this, we describe a lifelong learning system that leverages a fast, though non-differentiable, content-addressable memory which can be exploited to encode both a long history of sequential episodic knowledge and semantic knowledge accumulated over many episodes, for an unbounded number of domains. This opens the door to investigating transfer learning, and to applying prior knowledge learned over a lifetime of experience to new domains.
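The growing, non-differentiable, content-addressable memory described above can be sketched as a key-value store whose capacity grows with each write and whose reads are hard nearest-neighbor lookups. The sketch below is purely illustrative: the class name `GrowingCAM`, the cosine-similarity metric, and the list-backed storage are assumptions for exposition, not the paper's implementation.

```python
import math


class GrowingCAM:
    """A minimal growing content-addressable memory (illustrative sketch).

    Keys are dense vectors; a read returns the value whose stored key is
    most similar (by cosine similarity) to the query. Because retrieval
    is a hard argmax, the memory is fast but non-differentiable.
    """

    def __init__(self):
        self.keys = []    # stored key vectors
        self.values = []  # associated payloads (e.g., episodic traces)

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def write(self, key, value):
        # Appending rather than overwriting a fixed slot table is what
        # lets the memory's capacity grow without bound.
        self.keys.append(list(key))
        self.values.append(value)

    def read(self, query):
        # Nearest-neighbor lookup over all stored keys (hard argmax).
        if not self.keys:
            return None
        best = max(range(len(self.keys)),
                   key=lambda i: self._cosine(query, self.keys[i]))
        return self.values[best]


# Usage: store two episodes, then retrieve by a noisy cue.
mem = GrowingCAM()
mem.write([1.0, 0.0], "episode-A")
mem.write([0.0, 1.0], "episode-B")
print(mem.read([0.9, 0.1]))  # retrieves "episode-A"
```

A production system would replace the linear scan with an approximate nearest-neighbor index (e.g., hashing-based similarity search) to keep reads fast as the memory grows.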
