Concentration Theorems for Entropy and Free Energy

Abstract

Jaynes's entropy concentration theorem states that, for most words $\omega_1 \ldots \omega_N$ of length $N$ such that $\sum_{i=1}^{N} f(\omega_i) \approx vN$, the empirical frequencies of the values of a function $f$ are close to the probabilities that maximize the Shannon entropy given the value $v$ of the mathematical expectation of $f$. Using the notion of algorithmic entropy, we define notions of entropy for the Bose and Fermi statistical models of unordered data. New variants of Jaynes's concentration theorem are proved for these models. We also present concentration properties of the free energy in the case of a nonisolated isothermal system. Exact relations for the algorithmic entropy and free energy at extreme points are obtained; these relations are used to derive tight bounds on fluctuations of energy levels at equilibrium points.
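The distribution that maximizes Shannon entropy subject to a fixed expectation $\sum_i p_i f_i = v$ has the Gibbs (exponential-family) form $p_i \propto e^{-\beta f_i}$, with $\beta$ chosen to meet the constraint. As an illustration only (this code is not from the paper; the function name and the bisection bracket are my own assumptions), a minimal sketch of computing that maximizer numerically:

```python
import math

def maxent_distribution(f_values, v, tol=1e-10):
    """Maximum-entropy distribution over finitely many outcomes with
    values f_values, subject to the constraint sum_i p_i * f_i = v.
    The maximizer has the Gibbs form p_i ~ exp(-beta * f_i); beta is
    found by bisection, since beta -> E_beta[f] is monotone decreasing
    (its derivative is -Var_beta(f) <= 0)."""
    def expectation(beta):
        weights = [math.exp(-beta * f) for f in f_values]
        z = sum(weights)  # partition function
        return sum(w * f for w, f in zip(weights, f_values)) / z

    lo, hi = -50.0, 50.0  # assumed bracket; requires v strictly inside the range of f
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if expectation(mid) > v:
            lo = mid  # expectation too large: increase beta
        else:
            hi = mid
    beta = (lo + hi) / 2
    weights = [math.exp(-beta * f) for f in f_values]
    z = sum(weights)
    return [w / z for w in weights]
```

For example, with $f$ the face value of a die and $v = 3.5$, the constraint is satisfied at $\beta = 0$ and the maximizer is the uniform distribution, matching the intuition that maximum entropy with no effective constraint is uniform.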
