Quantifying the information transmitted in a single stimulus

Information theory, and in particular mutual information, has been widely used to investigate neural processing in various brain areas. Shannon mutual information quantifies how much information a set of neural responses carries, on average, about a set of stimuli. To extend this approach to the encoding of a single stimulus, we need to introduce a quantity specific to that stimulus. Four different measures of this kind have been proposed in the literature, but none of them satisfies all the intuitive properties (non-negativity, additivity) that characterize mutual information. We present here a detailed analysis of the meanings and properties of these four definitions. We show that all of these measures satisfy at least a weaker additivity condition, namely additivity limited to the response set. This allows us to use them for analysing correlated coding, as we illustrate with a toy example based on hippocampal place cells.
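To make the distinction between average and single-stimulus information concrete, here is a minimal numerical sketch (not taken from the paper; the joint distribution and function names are illustrative, and whether these two quantities coincide with the paper's four definitions is an assumption). For a discrete stimulus-response table p(s, r) it computes the Shannon mutual information together with two single-stimulus quantities that are standard in this literature: the specific surprise, the Kullback-Leibler divergence between p(r|s) and p(r), which is non-negative but not additive in general, and the specific information, H(R) - H(R|s), which is additive but can be negative. Both average back to the mutual information when weighted by p(s).

```python
import numpy as np

def mutual_information(p_sr):
    """Shannon mutual information I(S;R) in bits from a joint table p(s, r)."""
    p_s = p_sr.sum(axis=1, keepdims=True)   # marginal p(s)
    p_r = p_sr.sum(axis=0, keepdims=True)   # marginal p(r)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_sr * np.log2(p_sr / (p_s * p_r))
    return np.nansum(terms)                 # zero-probability cells contribute 0

def specific_surprise(p_sr, s):
    """I_1(s) = D_KL( p(r|s) || p(r) ): non-negative, but not additive in general."""
    p_r = p_sr.sum(axis=0)
    p_r_given_s = p_sr[s] / p_sr[s].sum()
    mask = p_r_given_s > 0
    return np.sum(p_r_given_s[mask] * np.log2(p_r_given_s[mask] / p_r[mask]))

def specific_information(p_sr, s):
    """I_2(s) = H(R) - H(R|s): additive over independent responses, but can be negative."""
    p_r = p_sr.sum(axis=0)
    p_r_given_s = p_sr[s] / p_sr[s].sum()
    h_r = -np.sum(p_r[p_r > 0] * np.log2(p_r[p_r > 0]))
    h_r_given_s = -np.sum(p_r_given_s[p_r_given_s > 0]
                          * np.log2(p_r_given_s[p_r_given_s > 0]))
    return h_r - h_r_given_s

# Hypothetical joint distribution p(s, r): rows are stimuli, columns are responses.
p_sr = np.array([[0.30, 0.10, 0.05],
                 [0.05, 0.10, 0.40]])
p_s = p_sr.sum(axis=1)

i1 = np.array([specific_surprise(p_sr, s) for s in range(p_sr.shape[0])])
i2 = np.array([specific_information(p_sr, s) for s in range(p_sr.shape[0])])

# Both single-stimulus measures average back to the mutual information.
print(mutual_information(p_sr), p_s @ i1, p_s @ i2)
```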
