Classification of stationary neuronal activity according to its information rate

We propose a measure of the information rate of a single stationary neuronal activity relative to a state of null information. The measure is based on the Kullback–Leibler distance between two interspike-interval distributions: the selected activity is compared with a Poisson model of the same mean firing frequency. We show that the approach is related to the notion of specific information and that the method allows us to judge relative encoding efficiency. Two classes of neuronal activity models are classified according to their information rate: renewal process models and first-order Markov chain models. We prove that information can be transmitted without changing either the spike rate or the coefficient of variation, and that an increase in serial correlation does not necessarily increase the information gain. We employ the simple but powerful Vasicek estimator of differential entropy to illustrate an application to experimental data from olfactory sensory neurons of the rat.
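The comparison with a Poisson model of matched rate has a convenient closed form: for an interspike-interval density p with mean mu, the Kullback–Leibler distance from the exponential density with the same mean equals 1 + ln(mu) - H(p), where H(p) is the differential entropy of p. The following Python sketch illustrates the resulting estimate; it is a minimal illustration, not the paper's implementation: the function names are ours, and the window m ~ sqrt(n) for Vasicek's spacing estimator is a common heuristic rather than a prescription from the text.

```python
import numpy as np

def vasicek_entropy(sample, m=None):
    """Vasicek's spacing estimator of differential entropy:
    H = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clamped at the sample boundaries."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))           # heuristic window choice (assumption)
    lo = np.clip(np.arange(n) - m, 0, n - 1)  # index of x_(i-m), clamped
    hi = np.clip(np.arange(n) + m, 0, n - 1)  # index of x_(i+m), clamped
    spacings = x[hi] - x[lo]
    return np.mean(np.log(n * spacings / (2.0 * m)))

def kl_information_per_spike(isi):
    """KL distance of the ISI sample from the exponential (Poisson-model)
    density with the same mean: D = 1 + log(mean) - H(p)."""
    isi = np.asarray(isi, dtype=float)
    mu = isi.mean()
    return (1.0 + np.log(mu)) - vasicek_entropy(isi)

# Illustration: regular (gamma) firing vs. the Poisson null model
rng = np.random.default_rng(0)
gamma_isi = rng.gamma(shape=4.0, scale=0.025, size=2000)  # mean ISI 0.1 s
print(kl_information_per_spike(gamma_isi))                # > 0: informative firing
print(kl_information_per_spike(rng.exponential(0.1, size=2000)))  # ~ 0, up to estimator bias
```

Vasicek's estimator is attractive here because it works directly from the sorted sample, requiring no intermediate density estimate; dividing the per-spike quantity by the mean interval would convert it into a rate per unit time.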
