Reliable biological communication with realistic constraints.

Communication in biological systems must cope with noise as well as metabolic and temporal constraints. We incorporate these constraints into information theory to obtain the distributions of signal usage that maximize the rate of information transfer for any noise structure and any set of constraints. Generalized versions of the Boltzmann, Gaussian, and Poisson distributions are obtained for linear, quadratic, and temporal constraints, respectively. These distributions imply that biological transformations must dedicate a larger share of the output range to the more probable inputs, and a smaller share to outputs with higher noise or a higher contribution to the constraint. To show the general theory of reliable communication at work, we apply these results to biochemical and neuronal signaling. Noncooperative enzyme kinetics is shown to be suited to high-quality signal transfer when the input distribution peaks at low concentrations, whereas cooperative kinetics suits near-Gaussian input statistics. Neuronal codes based on spike rates, spike times, or bursts must balance signal quality against cost efficiency and, at the network level, imply sparse and decorrelated activity within the limits set by noise, cost, and processing operations.
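A minimal numerical sketch of the linear-constraint case described above: maximizing the entropy of signal usage subject to a fixed mean (e.g. metabolic) cost yields a Boltzmann-like distribution p_i ∝ exp(−β c_i). The per-symbol costs and the cost budget below are illustrative assumptions, not values from the paper.

```python
import math

def boltzmann_dist(costs, mean_cost, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over symbols with costs `costs`,
    subject to a fixed mean cost. The solution has the Boltzmann form
    p_i ∝ exp(-beta * c_i); beta is found by bisection so that the
    distribution's mean cost matches `mean_cost`."""
    def mean_at(beta):
        w = [math.exp(-beta * c) for c in costs]
        z = sum(w)
        return sum(c * wi for c, wi in zip(costs, w)) / z

    # mean_at(beta) decreases monotonically in beta, so bisect:
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > mean_cost:
            lo = mid  # mean too high -> need a larger beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * c) for c in costs]
    z = sum(w)
    return beta, [wi / z for wi in w]

# Hypothetical per-symbol costs and a budget below the uniform mean (2.0),
# so the constraint is active and beta > 0.
costs = [0.0, 1.0, 2.0, 3.0, 4.0]
beta, p = boltzmann_dist(costs, mean_cost=1.0)
```

Because the budget is below the uniform-usage mean cost, β comes out positive and usage decays with cost: cheap symbols are used often, expensive ones rarely, which is the exponential signal-usage statistics the linear-constraint result predicts.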
