2007 Special Issue: Edge of chaos and prediction of computational performance for neural circuit models

We analyze in this article the significance of the edge of chaos for real-time computations in neural microcircuit models consisting of spiking neurons and dynamic synapses. We find that the edge of chaos predicts quite well those values of circuit parameters that yield maximal computational performance. However, it makes no prediction of the computational performance of circuits at other parameter values. Therefore, we propose a new method for predicting the computational performance of neural microcircuit models. The new measure directly estimates the kernel property and the generalization capability of a neural microcircuit. We validate the proposed measure by comparing its predictions with direct evaluations of the computational performance of various neural microcircuit models. The proposed method also allows us to quantify differences in the computational performance and generalization capability of neural circuits in different dynamic regimes (UP- and DOWN-states) that have been demonstrated through intracellular recordings in vivo.
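To make the rank-based idea behind such a kernel/generalization measure concrete, the following is a minimal sketch, not the authors' implementation: it uses a simple rate-based random recurrent network as a stand-in for a spiking microcircuit with dynamic synapses, and all parameter values, input-set sizes, and function names are illustrative assumptions.

```python
# Sketch of rank-based kernel and generalization measures (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

def run_circuit(inputs, W, W_in, n_steps):
    """Drive a random recurrent network with an input stream and
    return its final state vector (a stand-in for the circuit state)."""
    x = np.zeros(W.shape[0])
    for t in range(n_steps):
        x = np.tanh(W @ x + W_in * inputs[t])
    return x

n_units, n_steps = 100, 50
W = rng.normal(0, 1.0 / np.sqrt(n_units), (n_units, n_units))  # recurrent weights
W_in = rng.normal(0, 1.0, n_units)                              # input weights

# Kernel property: rank of the matrix of states produced by many *different* inputs.
# A high rank indicates that the circuit maps different input streams onto
# linearly separable states.
different_inputs = [rng.normal(0, 1, n_steps) for _ in range(60)]
states_kernel = np.stack([run_circuit(u, W, W_in, n_steps) for u in different_inputs])
kernel_rank = np.linalg.matrix_rank(states_kernel, tol=1e-6)

# Generalization capability: rank of the matrix of states produced by *noisy
# variants of a few* inputs. A low rank here indicates that the circuit does not
# amplify irrelevant input differences, which suggests better generalization.
base_inputs = [rng.normal(0, 1, n_steps) for _ in range(3)]
noisy_inputs = [u + 0.05 * rng.normal(0, 1, n_steps)
                for u in base_inputs for _ in range(20)]
states_gen = np.stack([run_circuit(u, W, W_in, n_steps) for u in noisy_inputs])
generalization_rank = np.linalg.matrix_rank(states_gen, tol=1e-6)

print("kernel rank:", kernel_rank, " generalization rank:", generalization_rank)
```

Comparing the two ranks across different circuit-parameter settings would then serve as the kind of predictor of computational performance discussed in the abstract.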
