Physics of the Shannon Limits

We provide a simple physical interpretation, in the context of the second law of thermodynamics, of the information inequality (also known as the Gibbs inequality, and equivalent to the log-sum inequality), which asserts that the relative entropy between two probability distributions cannot be negative. This inequality underlies the data processing theorem (DPT), and the DPT in turn is at the heart of most, if not all, proofs of converse theorems in Shannon theory. We therefore observe that, conceptually, the fundamental limits of information theory can be traced back to the laws of physics, in particular the second law of thermodynamics and, indirectly, the law of energy conservation. By the same token, in the other direction, the second law itself can be viewed as stemming from information-theoretic principles.
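The two facts the abstract leans on can be checked numerically: the relative entropy D(P||Q) is never negative (Gibbs inequality), and passing both distributions through the same channel cannot increase it (the data processing inequality for relative entropy). The sketch below, using NumPy with an arbitrary random stochastic channel of our own choosing, is only an illustration of these inequalities, not of the paper's thermodynamic argument:

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p || q) in nats; assumes supp(p) is contained in supp(q)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(5))  # two random distributions on 5 symbols
q = rng.dirichlet(np.ones(5))

# Gibbs inequality: D(p || q) >= 0, with equality iff p = q.
assert kl_divergence(p, q) >= 0.0

# Data processing inequality: a row-stochastic channel W acting on both
# distributions can only bring them "closer" in relative entropy.
W = rng.dirichlet(np.ones(5), size=5)  # each row is a conditional distribution
assert kl_divergence(p @ W, q @ W) <= kl_divergence(p, q)
```

Chaining the second inequality over the stages of a communication system is essentially how the DPT enters converse proofs: no processing step can recreate discriminating information that an earlier step has destroyed.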
