Some Observations on the Concepts of Information-Theoretic Entropy and Randomness

Jonathan D. H. Smith
Department of Mathematics, Iowa State University, Ames, IA 50011, USA
E-mail: jdhsmith@math.iastate.edu
URL: http://www.math.iastate.edu/jdhsmith/

Received: 15 February 2000 / Accepted: 11 January 2001 / Published: 1 February 2001

Abstract: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness. In physical applications, this translates to dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal for the use of the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also serves to yield a resolution of the so-called "Gibbs Paradox." Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect. Correction introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.

Keywords: information-theoretic entropy; Shannon entropy; Martin-Löf randomness; self-delimiting algorithmic complexity; thermodynamic entropy; Second Law of Thermodynamics; selection principle; wave equation; Gibbs Paradox; dimensional analysis; Kullback entropy; cross-entropy; reference density; improper prior.

© 2001 by the author. Reproduction for noncommercial purposes permitted.
