Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such a recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
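
The liquid-plus-readout separation described above can be illustrated with a toy simulation. The sketch below is not the spiking microcircuit model of the paper; it substitutes a small random rate network (an echo-state-style "liquid") for the circuit of integrate-and-fire neurons, and all names and parameter values (the network size, the 0.9 spectral-radius scaling, the 10-step delay task) are illustrative assumptions. It trains a linear readout by least squares to reconstruct a delayed copy of the input from the current network state, giving a minimal picture of how transient dynamics can act as a fading memory that a readout taps in real time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "liquid": a random recurrent rate network standing in for the
# spiking microcircuit (an assumption, not the model used in the paper).
N = 200                                      # number of liquid units (illustrative)
W = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale toward fading memory
w_in = rng.normal(0.0, 1.0, N)

# Time-varying input stream and the liquid's transient states.
T = 2000
u = rng.uniform(-1.0, 1.0, T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])         # transient state, no attractor required
    states[t] = x

# Linear readout trained to report the input from 10 steps ago,
# i.e. to extract past-input information from the *current* state.
delay = 10
X, y = states[delay:], u[:-delay]
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ w_out
print("readout correlation with delayed input:", np.corrcoef(pred, y)[0, 1])
```

The same recorded states could be reused to train additional readouts for other targets (different delays, or functions of past inputs), which mirrors the abstract's point that multiple task-specific readouts can draw on one generic recurrent circuit.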
