Incremental learning of complex temporal patterns

A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences presented sequentially. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount of retraining due to interference appears to be independent of the size of existing memory. The model is extended to include a chunking network, which detects subsequences repeated between and within sequences. The chunking mechanism substantially reduces the amount of retraining required during sequential training. Thus, the network investigated here constitutes an effective sequential memory. Various aspects of such a memory are discussed.
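The chunking idea can be illustrated outside the neural setting. In the model, a chunking network detects subsequences shared between and within sequences so they need not be relearned; a minimal non-neural sketch of the same detection step is a brute-force search for contiguous chunks common to two symbol sequences (the function name, the `min_len` parameter, and the brute-force strategy are illustrative assumptions, not the paper's mechanism):

```python
def shared_chunks(seq_a, seq_b, min_len=2):
    """Return the set of contiguous subsequences (chunks) of length
    >= min_len that occur in both seq_a and seq_b.

    Illustrative only: the paper's chunking network learns such
    repetitions with neural dynamics, not by exhaustive comparison.
    """
    # Enumerate every chunk of seq_a of the required length.
    subs_a = {tuple(seq_a[i:i + n])
              for n in range(min_len, len(seq_a) + 1)
              for i in range(len(seq_a) - n + 1)}
    # Keep the chunks of seq_b that also appear in seq_a.
    chunks = set()
    for n in range(min_len, len(seq_b) + 1):
        for i in range(len(seq_b) - n + 1):
            chunk = tuple(seq_b[i:i + n])
            if chunk in subs_a:
                chunks.add(chunk)
    return chunks

# Two symbol sequences sharing the chunk B-C-D (and its sub-chunks):
print(shared_chunks("ABCDE", "XBCDY"))
```

Once such a shared chunk is identified, a sequential memory only needs to store the chunk once and reference it from both sequences, which is the intuition behind the reduced retraining reported in the abstract.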
