Common Probability Patterns Arise from Simple Invariances

Shift and stretch invariance lead to the exponential-Boltzmann probability distribution. Rotational invariance generates the Gaussian distribution. Particular scaling relations transform the canonical exponential and Gaussian patterns into the variety of commonly observed patterns. The scaling relations themselves arise from the fundamental invariances of shift, stretch, and rotation, plus a few additional invariances. Prior work described the three fundamental invariances as a consequence of the equilibrium canonical ensemble of statistical mechanics or the Jaynesian maximization of information entropy. By contrast, I emphasize the primacy and sufficiency of invariance alone to explain the commonly observed patterns. These primary invariances naturally create the array of commonly observed scaling relations and associated probability patterns, whereas the classical approaches derived from statistical mechanics or information theory require special assumptions to derive commonly observed scales.
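
As a minimal sketch of the shift-invariance step (in notation of my own choosing: q for the probability pattern, z for the underlying measurement, k_a and λ for constants, T for a measurement scale; the paper's symbols may differ), requiring that a constant shift of the measurement change the pattern only by a multiplicative constant forces the exponential-Boltzmann form, and a change of measurement scale then carries that canonical form into other common patterns:

\[
q(z + a) = k_a\, q(z) \ \ \text{for all } a
\;\Longrightarrow\;
q'(z) = \frac{d}{da}\Big|_{a=0} k_a\, q(z) = -\lambda\, q(z)
\;\Longrightarrow\;
q(z) \propto e^{-\lambda z},
\]
\[
q(z) \propto e^{-\lambda T(z)}, \qquad
T(z) = z^2 \ \text{(Gaussian)}, \qquad
T(z) = \log z \ \text{(power law)}.
\]

Under the same reading, stretch invariance, that is invariance to a change of units z ↦ bz, fixes λ so that λz is dimensionless (for example, λ = 1/⟨z⟩), and combining two such dimensions under rotational invariance yields the Gaussian directly.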
