Mutual information and redundancy in spontaneous communication between cortical neurons

An important question in neural information processing is how neurons cooperate to transmit information. To study this question, we use the concept of redundancy in the information transmitted by a group of neurons and, at the same time, introduce a novel measure of cooperation between pairs of neurons, the relative mutual information (RMI). Specifically, we studied these two quantities for spike trains generated by neighboring neurons in the primary visual cortex of the awake, freely moving rat. The spike trains studied here were generated spontaneously by the cortical network, in the absence of visual stimulation. Under these conditions, our analysis revealed that while the RMI oscillated slightly around an average value, the redundancy was considerably more variable. We conjecture that this combination of approximately constant RMI and more variable redundancy makes information transmission more robust to noise. Furthermore, the redundancy values suggest that neurons can cooperate flexibly during information transmission: cooperation mostly occurs through a leading neuron with a higher transmission rate or, less frequently, synergistically, with the information rate of the whole group exceeding the sum of the individual information rates. The proposed method applies not only to stationary but also to locally stationary neural signals.
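
As a concrete illustration of these quantities, here is a minimal Python sketch; it is not the authors' implementation. Everything in it is an assumption made for illustration: spike trains are binned into binary sequences, entropy rates are estimated with a simple Lempel-Ziv (LZ78) phrase-counting estimator, the redundancy of a group is computed as the sum of the individual entropy rates minus the joint entropy rate (so negative values indicate synergy, matching the group-rate-exceeds-the-sum case above), and the normalization in relative_mutual_information is hypothetical, since the paper's exact RMI definition is not given in this excerpt.

```python
import numpy as np

def binarize(spike_times, t_start, t_stop, bin_width):
    """Turn spike times (seconds) into a binary sequence:
    1 if the bin contains at least one spike, 0 otherwise."""
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    return (counts > 0).astype(int)

def lz78_phrase_count(symbols):
    """Number of distinct phrases in the LZ78 incremental parsing."""
    phrases, w = set(), ()
    for s in symbols:
        w = w + (int(s),)
        if w not in phrases:
            phrases.add(w)
            w = ()
    return len(phrases) + (1 if w else 0)

def entropy_rate(symbols):
    """Crude LZ-based entropy-rate estimate in bits per bin, using the
    common convention h ~ c(n) * log2(n) / n (one of several in use)."""
    n = len(symbols)
    c = lz78_phrase_count(symbols)
    return c * np.log2(n) / n

def joint_symbols(seqs):
    """Encode simultaneous bins of several binary trains as one symbol
    (e.g. two trains map each bin to a value in {0, 1, 2, 3})."""
    seqs = np.asarray(seqs)
    weights = 2 ** np.arange(seqs.shape[0])
    return weights @ seqs

def mutual_information_rate(x, y):
    """I(X;Y) per bin, via H(X) + H(Y) - H(X,Y)."""
    return (entropy_rate(x) + entropy_rate(y)
            - entropy_rate(joint_symbols([x, y])))

def redundancy(seqs):
    """Sum of individual entropy rates minus the joint rate. Positive
    values mean redundancy; negative values mean synergy."""
    return (sum(entropy_rate(s) for s in seqs)
            - entropy_rate(joint_symbols(seqs)))

def relative_mutual_information(x, y):
    """Hypothetical RMI: mutual information normalized by the smaller
    entropy rate, so exact values lie in [0, 1]. The paper's own
    definition may differ."""
    return mutual_information_rate(x, y) / min(entropy_rate(x),
                                               entropy_rate(y))
```

Note that the LZ estimator converges slowly, so with short spike trains the resulting values should be read as qualitative; the bin width is also a free parameter that must be chosen relative to the neurons' firing rates.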
