Using Noise to Compute Error Surfaces in Connectionist Networks: A Novel Means of Reducing Catastrophic Forgetting

In error-driven distributed feedforward networks, new information typically interferes, sometimes severely, with previously learned information. We show how noise can be used to approximate the error surface of previously learned information. Combining this approximated error surface with the error surface associated with the new information to be learned improves the network's retention of previously learned items and significantly reduces catastrophic interference. Further, we show that the noise-generated error surface is produced using only first-derivative information, without recourse to any explicit error information.
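
The idea can be illustrated with a pseudorehearsal-style sketch: random (noise) inputs are passed through the already-trained network, and the network's own responses are kept as targets. By construction these noise pairs have zero error at the current weights, so their error surface has a minimum there; taking gradient steps on them while learning the new patterns approximates combining the old and new error surfaces. The following Python/NumPy sketch is illustrative only, not the paper's exact procedure: the network sizes, learning rate, pattern counts, and the noise-pseudopattern pairing are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """Minimal one-hidden-layer feedforward network trained with backprop."""
    def __init__(self, n_in, n_hid, n_out, lr=0.5):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hid))
        self.b1 = np.zeros(n_hid)
        self.W2 = rng.normal(0, 0.5, (n_hid, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1 + self.b1)
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def backward(self, x, target):
        # Squared-error gradients through the sigmoid nonlinearities.
        y = self.forward(x)
        dy = (y - target) * y * (1 - y)
        dh = (dy @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * np.outer(self.h, dy)
        self.b2 -= self.lr * dy
        self.W1 -= self.lr * np.outer(x, dh)
        self.b1 -= self.lr * dh

def train(net, inputs, targets, epochs):
    for _ in range(epochs):
        for x, t in zip(inputs, targets):
            net.backward(x, t)

# --- Phase 1: learn the "old" patterns. ---
old_x = rng.integers(0, 2, (20, 8)).astype(float)
old_t = rng.integers(0, 2, (20, 4)).astype(float)
net = MLP(8, 16, 4)
train(net, old_x, old_t, epochs=500)

# --- Approximate the old error surface with noise (assumption: a
# pseudorehearsal-style pairing of random inputs with the trained
# network's own responses; no stored exemplars are needed). ---
noise_x = rng.random((100, 8))
noise_t = np.array([net.forward(x) for x in noise_x])

# --- Phase 2: learn new patterns, interleaved with noise pseudopatterns. ---
new_x = rng.integers(0, 2, (20, 8)).astype(float)
new_t = rng.integers(0, 2, (20, 4)).astype(float)
for _ in range(500):
    for x, t in zip(new_x, new_t):
        net.backward(x, t)
    # Each step on a noise pseudopattern pulls the weights back toward
    # the minimum of the approximated old error surface.
    for x, t in zip(noise_x, noise_t):
        net.backward(x, t)

old_err = np.mean([(net.forward(x) - t) ** 2 for x, t in zip(old_x, old_t)])
print(f"mean squared error on old patterns after new learning: {old_err:.4f}")
```

In this sketch, the ratio of noise pseudopatterns to new patterns per epoch trades off stability on the old items against plasticity on the new ones; removing the pseudopattern loop recovers ordinary sequential training and, with it, severe interference on the old patterns.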
