Information Processing Capacity of Dynamical Systems

Many dynamical systems, both natural and artificial, are stimulated by time-dependent external signals and somehow process the information contained therein. We demonstrate how to quantify the different modes in which such a system can process information and combine them to define its computational capacity. This capacity is bounded by the number of linearly independent state variables of the dynamical system and equals it if the system satisfies the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli that the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate it with numerical simulations of the logistic map, a recurrent neural network, and a two-dimensional reaction-diffusion system, uncovering universal trade-offs between the nonlinearity of the computation and the system's short-term memory.
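
As a concrete illustration of the capacity measure sketched in the abstract, the following Python snippet estimates linear and quadratic processing capacities for a small random echo-state-style reservoir driven by i.i.d. input. It is a minimal numerical sketch under assumed parameter choices (reservoir size, spectral radius, input scaling, range of delays), not the authors' reference implementation: each capacity is estimated as C = 1 - min_w MSE / var(z) by linear regression of an orthogonal target function (a delayed input, or a degree-2 Legendre polynomial of a delayed input) on the reservoir states, and the summed capacities are compared against the bound set by the number of state variables.

```python
# Minimal sketch (illustrative parameters, not the paper's reference code):
# estimate processing capacities of a random tanh reservoir driven by i.i.d. input.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
N, T, washout = 50, 20000, 200          # assumed reservoir size and run length

# Random echo-state-style reservoir: x(t) = tanh(W x(t-1) + w_in u(t))
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9 (assumed)
w_in = rng.uniform(-1.0, 1.0, N)

u = rng.uniform(-1.0, 1.0, T)           # i.i.d. input, uniform on [-1, 1]
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])
X = x[washout:]                          # discard transient

def capacity(target):
    """C = 1 - min_w ||z - Xw||^2 / ||z||^2, estimated by least squares."""
    z = target - target.mean()
    w, *_ = np.linalg.lstsq(X, z, rcond=None)
    mse = np.mean((z - X @ w) ** 2)
    return max(0.0, 1.0 - mse / np.var(z))

# Linear (memory) capacities: targets z(t) = u(t - k)
linear = [capacity(np.roll(u, k)[washout:]) for k in range(1, 40)]

# Quadratic capacities: targets P2(u(t - k)), the degree-2 Legendre polynomial
P2 = lambda s: legendre.legval(s, [0, 0, 1])
quadratic = [capacity(P2(np.roll(u, k))[washout:]) for k in range(1, 40)]

print(f"sum of linear capacities    : {sum(linear):.2f}")
print(f"sum of quadratic capacities : {sum(quadratic):.2f}")
print(f"bound (number of state vars): {N}")
```

In such a sketch, increasing the input scaling typically shifts capacity from the linear (memory) targets toward the nonlinear ones, mirroring the trade-off between nonlinearity and short-term memory mentioned in the abstract.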
