Preventing Catastrophic Interference in Multiple-Sequence Learning Using Coupled Reverberating Elman Networks

Everyone agrees that real cognition requires much more than static pattern recognition; in particular, it requires the ability to learn sequences of patterns (or actions). Learning sequences, however, really means being able to learn multiple sequences, one after the other, without the most recently learned ones erasing the previously learned ones. If catastrophic interference is already a problem for the sequential learning of individual patterns, it is amplified many times over when multiple sequences of patterns must be learned consecutively, because each new sequence consists of many linked patterns. In this paper we present a connectionist architecture that appears to solve the problem of multiple-sequence learning by means of pseudopatterns.
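As the title indicates, the architecture couples two Elman (simple recurrent) networks: one learns new sequences while the other "reverberates", generating pseudopatterns that approximate previously learned sequences and are interleaved with the new training items. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the network sizes, the learning rule (one-step Elman-style backpropagation rather than full BPTT), and names such as ElmanNet, pseudo_sequence, and train_sequence are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ElmanNet:
    """Simple recurrent (Elman) network: the hidden state at time t-1
    is copied into a context layer that feeds the hidden layer at t."""

    def __init__(self, n_in, n_hid, n_out, lr=0.2):
        self.W_in = rng.normal(0.0, 0.5, (n_hid, n_in))
        self.W_ctx = rng.normal(0.0, 0.5, (n_hid, n_hid))
        self.W_out = rng.normal(0.0, 0.5, (n_out, n_hid))
        self.lr = lr
        self.n_hid = n_hid
        self.reset()

    def reset(self):
        self.ctx = np.zeros(self.n_hid)

    def step(self, x):
        h = sigmoid(self.W_in @ x + self.W_ctx @ self.ctx)
        y = sigmoid(self.W_out @ h)
        self.prev_ctx, self.ctx, self.h = self.ctx, h, h
        return y

    def learn(self, x, target):
        # One-step backpropagation with the context treated as a fixed
        # extra input (as in Elman, 1990), rather than full BPTT.
        y = self.step(x)
        d_out = (y - target) * y * (1.0 - y)
        d_hid = (self.W_out.T @ d_out) * self.h * (1.0 - self.h)
        self.W_out -= self.lr * np.outer(d_out, self.h)
        self.W_in -= self.lr * np.outer(d_hid, x)
        self.W_ctx -= self.lr * np.outer(d_hid, self.prev_ctx)

def pseudo_sequence(net, length, n_in):
    """'Reverberate' a trained network: seed it with a random binary
    input and record its input -> output transitions; these
    pseudopatterns stand in for the sequences it already knows."""
    net.reset()
    x = rng.integers(0, 2, n_in).astype(float)
    pairs = []
    for _ in range(length):
        y = net.step(x)
        pairs.append((x, y.copy()))
        x = (y > 0.5).astype(float)  # thresholded output becomes next input
    return pairs

def train_sequence(learner, refresher, sequence, epochs=500, n_pseudo=2):
    """Teach `learner` a new sequence of (input, target) pairs while
    interleaving pseudo-sequences from `refresher`, so that old
    knowledge is rehearsed instead of being overwritten."""
    n_in = sequence[0][0].size
    for _ in range(epochs):
        learner.reset()
        for x, t in sequence:
            learner.learn(x, t)
        for _ in range(n_pseudo):
            learner.reset()
            for x, t in pseudo_sequence(refresher, len(sequence), n_in):
                learner.learn(x, t)
```

In a full run of this scheme, the roles would reverse after each newly acquired sequence: the second network is retrained on pseudo-sequences drawn from the first, so that the coupled pair maintains a self-refreshing memory of everything learned so far.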
