CHAPTER 4 Information and communication in living systems

Living systems are able to communicate, i.e., to exchange information. Communication processes rest on an organization that is common to man-made and biological systems alike. A first purpose of this chapter is to derive a number of basic concepts, such as the entropy of a source, defined as the first moment (the expectation) of the self-information, i.e., of minus the logarithm of the probability of occurrence of an event. The chapter also discusses joint and conditional entropies, as well as mutual information, with the aim of showing that the very concept of information rests on the ability of a system (for instance, a communication channel) to associate signs in a specific manner so as to allow a message to be communicated. The relations between communication and mapping, the subadditivity principle, and the laws of coding are also discussed. The subadditivity principle is of particular interest: it states that the joint entropy of the channel is at most equal to (and in general smaller than) the sum of the entropies of the source and the destination, which prevents the channel from generating information of its own. The last part of the chapter is devoted to the analysis of information transfer between DNA and proteins, viewed as a communication channel.
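As a minimal numerical sketch of these definitions, the Python fragment below computes the source, destination, joint, and conditional entropies and the mutual information of a small discrete channel, and checks the subadditivity inequality H(X,Y) ≤ H(X) + H(Y). The joint distribution, and its reading as four codons mapped degenerately onto three amino acids, is invented here for illustration and is not taken from the chapter.

```python
import numpy as np

# Illustrative joint distribution p(x, y) over a 4-symbol source X
# (say, four codons) and a 3-symbol destination Y (say, three amino
# acids).  The mapping is degenerate (two codons share an amino acid)
# and slightly noisy.  All values are assumptions for this sketch.
p_xy = np.array([
    [0.22, 0.02, 0.01],
    [0.20, 0.03, 0.02],
    [0.02, 0.21, 0.02],
    [0.01, 0.02, 0.22],
])
assert np.isclose(p_xy.sum(), 1.0)

def entropy(p):
    """Shannon entropy in bits; the convention 0 log 0 = 0 is used."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

h_xy = entropy(p_xy.ravel())      # joint entropy H(X,Y)
h_x = entropy(p_xy.sum(axis=1))   # source entropy H(X), marginal over rows
h_y = entropy(p_xy.sum(axis=0))   # destination entropy H(Y), marginal over columns
h_y_given_x = h_xy - h_x          # conditional entropy H(Y|X), by the chain rule
mutual_info = h_x + h_y - h_xy    # mutual information I(X;Y)

print(f"H(X)   = {h_x:.3f} bits")
print(f"H(Y)   = {h_y:.3f} bits")
print(f"H(X,Y) = {h_xy:.3f} bits")
print(f"H(Y|X) = {h_y_given_x:.3f} bits")
print(f"I(X;Y) = {mutual_info:.3f} bits")

# Subadditivity: the joint entropy never exceeds the sum of the
# marginal entropies, so the channel cannot generate its own information.
assert h_xy <= h_x + h_y + 1e-12
```

Since I(X;Y) = H(X) + H(Y) − H(X,Y), subadditivity is equivalent to the non-negativity of mutual information: the equality H(X,Y) = H(X) + H(Y) holds only when source and destination are independent, i.e., when no information is transmitted at all.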
