Dynamics of information and emergent computation in generic neural microcircuit models

Numerous methods have already been developed to estimate the information contained in single spike trains. In this article we explore efficient methods for estimating the information contained in the simultaneous firing activity of hundreds of neurons. Such methods are needed to analyze data from multi-unit recordings. We test these methods on generic neural microcircuit models consisting of 800 neurons, and analyze the temporal dynamics of information about preceding spike inputs in such circuits. It turns out that information spreads with high speed in such generic neural microcircuit models, thereby supporting, without postulating any additional neural or synaptic mechanisms, the possibility of ultra-rapid computations on the first input spikes.
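The abstract does not spell out the estimators used; as a minimal illustration of the standard approach to such analyses, the sketch below computes a plug-in (maximum-likelihood) entropy estimate from binned spike counts, a first-order Miller-Madow bias correction (one common correction for limited sampling, not necessarily the one used in this work), and the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint count table. All function names here are illustrative, not taken from the paper.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy estimate in bits
    from a vector of bin counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

def miller_madow_entropy(counts):
    """Plug-in entropy plus the first-order Miller-Madow
    bias correction (k - 1) / (2 n ln 2), in bits."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    k = np.count_nonzero(counts)  # number of occupied bins
    return plugin_entropy(counts) + (k - 1) / (2.0 * n * np.log(2))

def mutual_information(joint_counts):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a
    2-D table of joint counts (rows: X, columns: Y)."""
    joint = np.asarray(joint_counts, dtype=float)
    h_x = plugin_entropy(joint.sum(axis=1))   # marginal over Y
    h_y = plugin_entropy(joint.sum(axis=0))   # marginal over X
    h_xy = plugin_entropy(joint.ravel())      # joint entropy
    return h_x + h_y - h_xy

# Sanity checks on two binary variables:
# independent counts give ~0 bits, perfectly coupled counts give 1 bit.
print(mutual_information([[25, 25], [25, 25]]))  # → 0.0
print(mutual_information([[50, 0], [0, 50]]))    # → 1.0
```

For hundreds of simultaneously recorded neurons, direct estimation over the joint response space is infeasible because the number of possible response patterns grows exponentially with the population size, which is why bias corrections and dimensionality-reducing readouts of the kind studied in this article become essential.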
