Recursive self-organizing maps

This paper explores the combination of a self-organizing map (SOM) with feedback, in order to represent sequences of inputs. In general, neural networks with time-delayed feedback represent time implicitly, by combining current inputs with past activities. It has been difficult to apply this approach to the SOM, because feedback generates instability during learning. We demonstrate a solution to this problem, based on a nonlinearity. The result is a generalization of the SOM that learns to represent sequences recursively. We show that the resulting representations are adapted to the temporal statistics of the input series.
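To make the idea concrete, here is a minimal NumPy sketch of one way such a recursive SOM update could be organized: each unit holds an input weight vector and a context vector that is compared against the previous map activation, and an exponential transfer function keeps the fed-back activity bounded. The class name, the parameters (alpha, beta, learning rate, neighborhood width), the one-dimensional map, and the particular distance measure are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical sketch of a SOM with activity feedback (recursive SOM).
# The abstract only states that a nonlinearity stabilizes learning with
# feedback; the details below are assumptions for illustration.
class RecursiveSOM:
    def __init__(self, n_units, dim, alpha=1.0, beta=0.5, lr=0.1, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_units, dim))       # input weight vectors
        self.c = rng.normal(size=(n_units, n_units))   # context vectors (feedback targets)
        self.y = np.zeros(n_units)                     # previous map activation
        self.alpha, self.beta = alpha, beta
        self.lr, self.sigma = lr, sigma
        self.grid = np.arange(n_units)                 # 1-D map topology for simplicity

    def step(self, x):
        # Distance combines the current input with the previous activation pattern.
        d = (self.alpha * np.sum((x - self.w) ** 2, axis=1)
             + self.beta * np.sum((self.y - self.c) ** 2, axis=1))
        y_new = np.exp(-d)                             # nonlinearity keeps feedback bounded
        winner = int(np.argmin(d))
        # Standard SOM neighborhood update, applied to both weight sets.
        h = np.exp(-((self.grid - winner) ** 2) / (2 * self.sigma ** 2))
        self.w += self.lr * h[:, None] * (x - self.w)
        self.c += self.lr * h[:, None] * (self.y - self.c)
        self.y = y_new
        return winner

# Example: feed a scalar sequence and observe the winning units over time.
som = RecursiveSOM(n_units=20, dim=1)
sequence = np.sin(np.linspace(0, 10, 200)).reshape(-1, 1)
winners = [som.step(x) for x in sequence]
```

Because each unit's context vector is matched against the whole previous activation, the winner depends on the recent history of inputs, not just the current one, which is what allows the map to specialize units to different input subsequences.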
