Physics of the Shannon limits

We provide a simple physical interpretation, in the context of the second law of thermodynamics, of the information inequality (also known as Gibbs' inequality, and equivalent to the log-sum inequality), which asserts that the relative entropy between two probability distributions cannot be negative. This inequality underlies the data processing theorem (DPT), and the DPT in turn is at the heart of most, if not all, proofs of converse theorems in Shannon theory. It follows that, conceptually, the fundamental limits of Information Theory can be traced back to the laws of physics, in particular to the second law of thermodynamics and, at least indirectly, to the law of energy conservation. By the same token, in the opposite direction, the second law can be viewed as stemming from information-theoretic principles.
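
For concreteness, the two standard facts on which the abstract hinges can be stated compactly. The following is a minimal textbook-style sketch in LaTeX notation; the symbols $D(P\|Q)$ for relative entropy and $W$ for a channel are conventional choices, not taken from this paper itself. The information inequality says that for probability distributions $P$ and $Q$ on a common alphabet,

  \[ D(P\|Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)} \;\ge\; 0, \]

with equality if and only if $P = Q$. A one-line proof uses $\log t \le t - 1$:

  \[ -D(P\|Q) \;=\; \sum_{x} P(x) \log \frac{Q(x)}{P(x)} \;\le\; \sum_{x} P(x)\left(\frac{Q(x)}{P(x)} - 1\right) \;=\; \sum_{x} Q(x) - 1 \;\le\; 0. \]

The divergence form of the DPT then follows from the log-sum inequality: if $P'$ and $Q'$ are obtained by passing $P$ and $Q$ through the same channel $W$, i.e., $P'(y) = \sum_x W(y|x)P(x)$ and $Q'(y) = \sum_x W(y|x)Q(x)$, then $D(P'\|Q') \le D(P\|Q)$; no processing can increase the distinguishability between two distributions.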
