How chaos boosts the encoding capacity of small recurrent neural networks: learning considerations

To date, recurrent networks restricted to fixed-point dynamics have shown very poor encoding capacity. When the same networks are instead maintained preferentially in chaotic regimes, however, they can encode an enormous amount of information in their cyclic attractors, which greatly boosts their capacity. A previous paper described a simple way to encode such information by robustly associating each vector of an N-dimensional space with one "symbolic" cyclic attractor; its main message was the monotonic increase of chaotic spontaneous regimes as a function of the number of attractors to be learned. However, no algorithm was provided to adjust the connection weights so as to encode a given input set. For this purpose, this paper revisits the classical gradient-based backpropagation-through-time (BPTT) learning algorithm. We show that this algorithm gives poor results and, moreover, that its use strongly dampens the network's "chaoticity", and hence its encoding capacity.
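To make the revisited procedure concrete, the following is a minimal sketch of classical BPTT on a small fully recurrent tanh network. The network size, the rotating one-hot target cycle, the squared-error loss, and the learning rate are all illustrative assumptions, not the paper's exact setup; the sketch only shows the generic unroll-and-backpropagate mechanics that a gradient-based attractor-learning experiment would build on.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 4, 6                        # units; unrolled steps (assumed values)
W = rng.normal(0.0, 0.5, (N, N))   # recurrent weights
b = np.zeros(N)

# Hypothetical cyclic target: a one-hot pattern rotating through the units.
target = 0.8 * np.eye(N)[np.arange(T) % N]

def forward(W, b, x0):
    """Unroll the network: x_t = tanh(W x_{t-1} + b)."""
    xs = [x0]
    for _ in range(T):
        xs.append(np.tanh(W @ xs[-1] + b))
    return xs

def bptt_grads(W, b, x0):
    """One BPTT pass: squared-error loss and gradients over the unrolled run."""
    xs = forward(W, b, x0)
    dW, db = np.zeros_like(W), np.zeros_like(b)
    delta = np.zeros(N)            # gradient flowing back through the state
    loss = 0.0
    for t in range(T, 0, -1):
        err = xs[t] - target[t - 1]
        loss += 0.5 * np.sum(err ** 2)
        delta = (delta + err) * (1.0 - xs[t] ** 2)  # back through tanh
        dW += np.outer(delta, xs[t - 1])
        db += delta
        delta = W.T @ delta        # propagate to the previous time step
    return loss, dW, db

x0 = np.zeros(N)
losses = []
for _ in range(500):               # plain gradient descent, assumed lr = 0.1
    loss, dW, db = bptt_grads(W, b, x0)
    W -= 0.1 * dW
    b -= 0.1 * db
    losses.append(loss)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In the paper's setting one would additionally monitor the network's spontaneous dynamics (e.g. via Lyapunov exponents) during such training; it is precisely this "chaoticity" that the abstract reports being dampened by the gradient updates.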

[1]  E. Ott Chaos in Dynamical Systems: Contents , 1993 .

[2]  F. Pasemann Complex dynamics and the structure of small neural networks , 2002 .

[3]  B. Hao,et al.  Symbolic dynamics and characterization of complexity , 1991 .

[4]  W. Freeman,et al.  How brains make chaos in order to make sense of the world , 1987, Behavioral and Brain Sciences.

[5]  W. Freeman Simulation of chaotic EEG patterns with a dynamic model of the olfactory system , 1987, Biological Cybernetics.

[6]  Philipp Slusallek,et al.  Introduction to real-time ray tracing , 2005, SIGGRAPH Courses.

[7]  C. Molter,et al.  Fascinating rhythms by chaotic Hopfield networks , 2003, Proceedings of the International Joint Conference on Neural Networks, 2003..

[8]  Paul J. Werbos,et al.  Backpropagation Through Time: What It Does and How to Do It , 1990, Proc. IEEE.

[9]  Hava T. Siegelmann,et al.  On the Computational Power of Neural Nets , 1995, J. Comput. Syst. Sci..

[10]  F. Pasemann Complex dynamics and the structure of small neural networks , 2002, Network.

[11]  H. Bersini,et al.  Frustrated chaos in biological networks. , 1997, Journal of theoretical biology.

[12]  Shin Ishii,et al.  A network of chaotic elements for information processing , 1996, Neural Networks.

[13]  John F. Kolen,et al.  Understanding and Explaining DRN Behavior , 2001 .

[14]  Werner Ebeling,et al.  Entropy of symbolic sequences: the role of correlations , 1991 .

[15]  A Babloyantz,et al.  Computation with chaos: a paradigm for cortical activity. , 1994, Proceedings of the National Academy of Sciences of the United States of America.

[16]  F. Pasemann DYNAMICS OF A SINGLE MODEL NEURON , 1993 .

[17]  I. Tsuda Toward an interpretation of dynamic neural activity in terms of chaotic dynamical systems. , 2001, The Behavioral and brain sciences.

[18]  Ichiro Tsuda,et al.  Towards an interpretation of dynamic neural activity in terms of chaotic dynamical systems , 2000 .

[19]  Eduardo Sontag,et al.  Computational power of neural networks , 1995 .

[20]  T. Gelder,et al.  The dynamical hypothesis in cognitive science , 1998, Behavioral and Brain Sciences.

[21]  Danil V. Prokhorov,et al.  Enhanced Multi-Stream Kalman Filter Training for Recurrent Networks , 1998 .

[22]  A. Wolf,et al.  Determining Lyapunov exponents from a time series , 1985 .

[23]  Bruno Cessac,et al.  Self-organization and dynamics reduction in recurrent networks: stimulus presentation and learning , 1998, Neural Networks.

[24]  F. Varela,et al.  Perception's shadow: long-distance synchronization of human brain activity , 1999, Nature.

[25]  Robert Kozma,et al.  Chaotic Resonance - Methods and Applications for Robust Classification of noisy and Variable Patterns , 2001, Int. J. Bifurc. Chaos.

[26]  W. Singer,et al.  Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties , 1989, Nature.