Inferring entropy from structure.

The thermodynamic definition of entropy can be extended to nonequilibrium systems based on its relation to information. Applying this definition in practice requires access to the physical system's microstates, which may be prohibitively inefficient to sample or difficult to obtain experimentally. It is beneficial, therefore, to relate the entropy to other integrated properties that are accessible out of equilibrium. We focus on the structure factor, which describes the spatial correlations of density fluctuations and can be measured directly by scattering. The information gained from a given structure factor about an otherwise unknown system provides an upper bound on the system's entropy. We find that the maximum-entropy model corresponds to an equilibrium system with an effective pair interaction. Approximate closed-form relations for the effective pair potential and the resulting entropy in terms of the structure factor are obtained. As examples, the relations are used to estimate the entropy of an exactly solvable model and of two simulated systems out of equilibrium. The focus is on low-dimensional examples, where our method, as well as a recently proposed compression-based one, can be tested against a rigorous direct-sampling technique. The entropy inferred from the structure factor is found to be consistent with the other methods, superior to them for larger system sizes, and accurate in identifying global transitions. Our approach allows for extensions of the theory to more complex systems and to higher-order correlations.
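To illustrate the kind of closed-form relation meant here, a minimal sketch under a standard Gaussian (random-phase) approximation, not necessarily the exact expressions derived in this work: for a system at number density $\rho$ in $d$ dimensions with structure factor $S(q)$, the effective pair potential $u_{\mathrm{eff}}$ and the excess entropy per particle $s_{\mathrm{ex}}$ may be estimated as

$$\beta u_{\mathrm{eff}}(q) \;\approx\; \frac{1}{\rho}\left[\frac{1}{S(q)}-1\right], \qquad \frac{s_{\mathrm{ex}}}{k_{\mathrm B}} \;\approx\; \frac{1}{2\rho}\int \frac{d^d q}{(2\pi)^d}\,\Big[\ln S(q)-S(q)+1\Big],$$

where $\beta = 1/(k_{\mathrm B} T)$. The integrand vanishes for the uncorrelated (ideal-gas) case $S(q)=1$ and is negative otherwise, so any measured structure lowers the entropy bound relative to the ideal gas.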
