Information Theory and Statistical Physics - Lecture Notes

This document consists of lecture notes for a graduate course on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, as well as at graduate students in Physics with a basic background in Information Theory. Strong emphasis is placed on the analogy and parallelism between Information Theory and Statistical Physics, and on the insights, analysis tools, and techniques that can be borrowed from Statistical Physics and 'imported' into certain problem areas of Information Theory. This research trend has been very active over the last few decades, and the hope is that exposing students to the meeting points between these two disciplines will broaden their background and perspective for carrying out research in the field. A short outline of the course is as follows: Introduction; Elementary Statistical Physics and its Relation to Information Theory; Analysis Tools in Statistical Physics; Systems of Interacting Particles and Phase Transitions; The Random Energy Model (REM) and Random Channel Coding; Additional Topics (optional).
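As one concrete illustration of the analogy emphasized above (a standard correspondence sketched here for orientation, not quoted from the notes themselves), the posterior over transmitted messages induced by a noisy channel has exactly the form of a Boltzmann-Gibbs distribution:

\[
P_\beta(x) \;=\; \frac{e^{-\beta E(x)}}{Z(\beta)},
\qquad
Z(\beta) \;=\; \sum_{x} e^{-\beta E(x)},
\qquad
F(\beta) \;=\; -\frac{1}{\beta}\ln Z(\beta),
\]
and, for a channel \(W(y\mid x)\) with input prior \(P(x)\),
\[
P(x\mid y) \;=\; \frac{P(x)\,W(y\mid x)}{\sum_{x'} P(x')\,W(y\mid x')}
\;=\; \frac{e^{-E(x)}}{Z},
\qquad
E(x) \;\triangleq\; -\ln\bigl[P(x)\,W(y\mid x)\bigr],\quad \beta = 1.
\]

Under this identification, the normalizing sum plays the role of a partition function and \(-\ln Z\) that of a free energy; questions about random coding error exponents then become questions about the fluctuations of free energies in systems with random energy levels, which is where the Random Energy Model enters the picture.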
