Optimum neural tuning curves for information efficiency with rate coding and finite-time window

An important question in neural encoding is which kinds of neural systems can convey more information with less energy within a finite coding time window. This paper first proposes a finite-time neural encoding system in which the neurons respond to a stimulus with a sequence of spikes modeled as a Poisson process, while the external stimuli follow a normal distribution. A method for calculating the mutual information of this finite-time encoding system is proposed, and the notion of information efficiency is defined. The mutual information and information efficiency obtained with the logistic tuning function are compared with those obtained with other functions, and the logistic function is found to perform best. We further find that the parameter controlling the steepness of the logistic function is closely related to the full entropy, while the parameter controlling its translation is tightly linked to the energy consumption and the noise entropy. The optimal parameter combinations of the logistic function that maximize the information efficiency are computed as the stimuli and the properties of the encoding system are varied, and explanations for the results are given. The model and method proposed here could be useful for studying neural encoding systems, and the optimal tuning curves obtained in this paper may exhibit some characteristics of real neural systems.
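The quantities the abstract describes can be sketched numerically: a Gaussian stimulus drives a logistic tuning curve, the spike count in a window T is Poisson, and the mutual information I(S;N) is computed by discretizing the stimulus. This is a minimal illustration, not the paper's actual method; all parameter values (r_max, k, s0, T, the unit-cost-per-spike energy model) are assumptions chosen for the example.

```python
import math

# Hypothetical parameters (not from the paper): logistic tuning curve
# f(s) = r_max / (1 + exp(-k*(s - s0))), Gaussian stimulus, window T.
r_max, k, s0 = 50.0, 1.5, 0.0      # peak rate (Hz), steepness, translation
T = 0.1                            # finite coding window (s)
mu, sigma = 0.0, 1.0               # stimulus distribution N(mu, sigma^2)

def logistic_rate(s):
    """Firing rate for stimulus s under the logistic tuning curve."""
    return r_max / (1.0 + math.exp(-k * (s - s0)))

def poisson_pmf(n, lam):
    """P(N = n) for a Poisson spike count with mean lam."""
    return math.exp(-lam) * lam**n / math.factorial(n)

# Discretize the stimulus over +/- 4 sigma; weights approximate the density.
S = [mu + sigma * (-4.0 + 8.0 * i / 400) for i in range(401)]
w = [math.exp(-(s - mu) ** 2 / (2 * sigma**2)) for s in S]
Z = sum(w)
p_s = [x / Z for x in w]

n_max = int(r_max * T * 5) + 20    # truncate the spike-count support
# Conditional P(n|s) and marginal P(n) spike-count distributions.
p_n_given_s = [[poisson_pmf(n, logistic_rate(s) * T) for n in range(n_max)]
               for s in S]
p_n = [sum(p_s[i] * p_n_given_s[i][n] for i in range(len(S)))
       for n in range(n_max)]

# Mutual information I(S;N) in bits for this finite window.
mi = sum(p_s[i] * p_n_given_s[i][n] * math.log2(p_n_given_s[i][n] / p_n[n])
         for i in range(len(S)) for n in range(n_max)
         if p_n_given_s[i][n] > 0 and p_n[n] > 0)

# Energy proxy: mean spike count (unit cost per spike is an assumption).
energy = sum(p_s[i] * logistic_rate(S[i]) * T for i in range(len(S)))
efficiency = mi / energy
print(f"I(S;N) = {mi:.3f} bits, mean spikes = {energy:.2f}, "
      f"efficiency = {efficiency:.3f} bits/spike")
```

Under this setup, sweeping the steepness k and translation s0 and repeating the computation is one way to search for the efficiency-maximizing parameter combinations the abstract refers to.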
