Localizing Catastrophic Forgetting in Neural Networks

Artificial neural networks (ANNs) suffer from catastrophic forgetting when trained on a sequence of tasks. While this phenomenon has been studied in the past, recent research on it remains limited. We propose a method for determining the contribution of individual parameters in an ANN to catastrophic forgetting, and use it to analyze an ANN's response to three different continual learning scenarios.
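The abstract does not spell out how per-parameter contributions are computed. As a minimal illustrative sketch (not the paper's actual method), one could train on task A, then on task B, and attribute forgetting to each parameter by restoring it individually to its post-task-A value and measuring how much task-A loss it recovers; all names and the linear-model setup below are hypothetical:

```python
import numpy as np

def task_loss(w, X, y):
    # Mean squared error of a linear model on one task.
    return np.mean((X @ w - y) ** 2)

def forgetting_contributions(w_a, w_b, X_a, y_a):
    """Attribute task-A forgetting to individual parameters.

    For each parameter, restore its task-A value inside the
    task-B weight vector and record how much task-A loss that
    single restoration recovers.
    """
    base = task_loss(w_b, X_a, y_a)  # task-A loss after training on task B
    contrib = np.zeros_like(w_a)
    for i in range(len(w_a)):
        w = w_b.copy()
        w[i] = w_a[i]                # restore one parameter only
        contrib[i] = base - task_loss(w, X_a, y_a)
    return contrib

# Toy sequential-learning setup: two noiseless linear tasks.
rng = np.random.default_rng(0)
X_a, X_b = rng.normal(size=(100, 5)), rng.normal(size=(100, 5))
y_a = X_a @ rng.normal(size=5)       # task A targets
y_b = X_b @ rng.normal(size=5)       # task B targets

w_a = np.linalg.lstsq(X_a, y_a, rcond=None)[0]  # fit task A
w_b = np.linalg.lstsq(X_b, y_b, rcond=None)[0]  # then fit task B from scratch

contrib = forgetting_contributions(w_a, w_b, X_a, y_a)
```

Parameters with large `contrib` values are those whose drift during task-B training accounts for most of the task-A performance loss; in a real ANN the same probe could be run per weight or per layer.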
