Probability distributions of the logarithm of inter-spike intervals yield accurate entropy estimates from small datasets

The maximal information that the spike train of any neuron can pass on to subsequent neurons is quantified by the neuronal firing pattern entropy. Difficulties associated with estimating entropy from small datasets have proven an obstacle to the widespread reporting of firing pattern entropies and, more generally, to the use of information theory within the neuroscience community. In the most accessible class of entropy estimation techniques, spike trains are partitioned linearly in time and entropy is estimated from the probability distribution of firing patterns within a partition. Ample previous work has focused on techniques to minimize the finite-dataset bias and standard deviation of entropy estimates from under-sampled probability distributions of spike timing events partitioned linearly in time. In this manuscript we present evidence that all such distribution-based techniques would benefit from partitioning inter-spike intervals logarithmically in time. We show that with logarithmic partitioning, firing rate changes become independent of firing pattern entropy. We delineate the entire entropy estimation process with two example neuronal models, demonstrating the robust improvements in bias and standard deviation that the logarithmic-time method yields over two widely used linear-time partitioning approaches.
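The abstract describes the method only at a high level. As a minimal sketch of the core idea, not the authors' implementation, the following Python code estimates the Shannon entropy of an inter-spike-interval distribution from a small spike train, comparing bins spaced linearly in time against bins spaced logarithmically. The function name, bin count, and the synthetic lognormal spike train are illustrative assumptions.

```python
import numpy as np

def isi_entropy(spike_times, n_bins=32, log_bins=True):
    """Plug-in Shannon entropy estimate (bits) of the inter-spike-interval
    distribution, using either linear or logarithmic time bins.

    Illustrative sketch only; the bin count and edge placement are
    arbitrary choices, not the paper's exact procedure.
    """
    isis = np.diff(np.sort(spike_times))
    isis = isis[isis > 0]  # guard against duplicate spike times
    if log_bins:
        # Partition the ISI axis logarithmically: equal widths in log(ISI).
        edges = np.logspace(np.log10(isis.min()), np.log10(isis.max()), n_bins + 1)
    else:
        # Conventional approach: equal widths in linear time.
        edges = np.linspace(isis.min(), isis.max(), n_bins + 1)
    counts, _ = np.histogram(isis, bins=edges)
    p = counts[counts > 0] / counts.sum()  # empirical bin probabilities
    return -np.sum(p * np.log2(p))         # naive (plug-in) entropy estimate

# Toy comparison on a small synthetic dataset: lognormally distributed
# ISIs, loosely mimicking firing that spans multiple timescales, sampled
# with only a few hundred spikes.
rng = np.random.default_rng(0)
spikes = np.cumsum(rng.lognormal(mean=-3.0, sigma=1.0, size=300))
print("linear bins:", isi_entropy(spikes, log_bins=False))
print("log bins:   ", isi_entropy(spikes, log_bins=True))
```

A real analysis would layer a finite-sample bias correction on top of this plug-in estimator, for example the Miller-Madow correction, which adds (K - 1)/(2N ln 2) bits for K occupied bins and N samples, since naive estimates from under-sampled histograms are biased downward; the comparison the paper makes is about how the choice of partitioning interacts with exactly that small-dataset bias.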
