Kolmogorov-Sinai entropy and dissipation in driven classical Hamiltonian systems

Many connections between physics and information theory have been revealed since Shannon's development of classical information theory. A key concept in this connection is entropy, which quantifies the amount of information gained by an observer who performs measurements in an experiment. Statistical mechanics is deeply connected to information theory through Jaynes' maximum entropy principle, which defines equilibrium probability distributions as those that maximize entropy under given physical constraints. In this sense, these distributions are the least biased probability assignments compatible with the available information. Following this path, the energy dissipated in a classical Hamiltonian process (also known as the thermodynamic entropy production) has been connected to the relative entropy between the forward and backward probability densities. Recent work by Still et al. showed that, in Markovian processes, energetic inefficiency and model inefficiency are equivalent concepts, where the latter is defined as the difference between the mutual information that the system's state shares with past environmental variables and the mutual information it shares with future ones. This raises the question of whether model unpredictability and energetic inefficiency are also connected in the framework of classical physics. The aim of this study is to connect the random behavior of a classical Hamiltonian system with its energetic inefficiency. Random behavior is quantified by the Kolmogorov-Sinai entropy associated with the dynamics, an information-theoretic approach to chaos, whereas energetic inefficiency is measured by the dissipated work.
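For concreteness, the three quantities invoked above admit standard formulations; the notation below is ours and follows the conventions of refs. [13], [15], and [22]. Jaynes' principle selects, among all phase-space densities compatible with a mean-energy constraint, the one maximizing the Gibbs-Shannon entropy $S[\rho] = -\int \rho \ln \rho \, dx$, which yields the canonical distribution

\[ \rho_{\mathrm{eq}}(x) = \frac{e^{-\beta H(x)}}{Z(\beta)}, \qquad Z(\beta) = \int e^{-\beta H(x)} \, dx . \]

The phase-space identity of ref. [15] expresses the average dissipated work of a driven process as a relative entropy between the phase-space density $\rho$ of the forward process and the density $\tilde{\rho}$ of the time-reversed process, compared at corresponding times:

\[ \langle W_{\mathrm{diss}} \rangle = \langle W \rangle - \Delta F = k_B T \, D(\rho \,\|\, \tilde{\rho}) . \]

Finally, the Kolmogorov-Sinai entropy is the largest asymptotic rate at which a measure-preserving dynamics $T$ generates Shannon entropy over refinements of a finite partition $\xi$, and for smooth ergodic systems Pesin's identity relates it to the positive Lyapunov exponents $\lambda_i$:

\[ h_{\mathrm{KS}} = \sup_{\xi} \lim_{n\to\infty} \frac{1}{n} H\!\left( \bigvee_{k=0}^{n-1} T^{-k}\xi \right), \qquad h_{\mathrm{KS}} = \sum_{\lambda_i > 0} \lambda_i . \]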

[1] M. Paternostro et al., Non-Markovian quantum processes: Complete framework and efficient characterization, 2015, arXiv:1512.00589.

[2] G. Crooks, Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences, 1999, Physical Review E.

[3] M. Müller et al., Entropy and Quantum Kolmogorov Complexity: A Quantum Brudno's Theorem, 2005, arXiv.

[4] V. I. Arnolʹd et al., Ergodic problems of classical mechanics, 1968.

[5] C. Jarzynski, Equalities and Inequalities: Irreversibility and the Second Law of Thermodynamics at the Nanoscale, 2011.

[7] P. Skrzypczyk et al., The role of quantum information in thermodynamics—a topical review, 2015, arXiv:1505.07835.

[8] D. Ruelle et al., Ergodic theory of chaos and strange attractors, 1985.

[9] S. Still et al., Optimal causal inference: estimating stored information and approximating causal architecture, 2007, Chaos.

[10] S. Still, Information-theoretic approach to interactive learning, 2007, arXiv:0709.1948.

[11] S. Still et al., The thermodynamics of prediction, 2012, Physical Review Letters.

[12] R. Landauer, Irreversibility and heat generation in the computing process, 1961, IBM Journal of Research and Development.

[13] E. Jaynes, Information Theory and Statistical Mechanics, 1957.

[14] C. Jarzynski et al., Quantum-Classical Correspondence Principle for Work Distributions, 2015, arXiv:1507.05763.

[15] J. Parrondo et al., Dissipation: the phase-space perspective, 2007, Physical Review Letters.

[17] P. Grigolini et al., Trajectory versus probability density entropy, 2001, Physical Review E.

[18] G. Lindblad, Non-Markovian quantum stochastic processes and their entropy, 1979.

[19] C. E. Shannon, A mathematical theory of communication, 1948, Bell System Technical Journal.

[20] W. H. Zurek, Thermodynamic cost of computation, algorithmic complexity and the information metric, 1989, Nature.

[21] P. Talkner et al., Colloquium: Quantum fluctuation relations: Foundations and applications, 2010, arXiv:1012.2268.

[22] R. Frigg, In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory, 2004, The British Journal for the Philosophy of Science.

[23] V. Latora et al., Kolmogorov-Sinai Entropy Rate versus Physical Entropy, 1998, arXiv:chao-dyn/9806006.

[24] C. Jarzynski, Nonequilibrium Equality for Free Energy Differences, 1996, arXiv:cond-mat/9610209.

[25] T. M. Cover et al., Elements of Information Theory, 2005.

[26] A. Vulpiani et al., Production rate of the coarse-grained Gibbs entropy and the Kolmogorov-Sinai entropy: a real connection?, 2004, Physical Review E.

[27] M. Visser et al., Coarse Graining Shannon and von Neumann Entropies, 2017, Entropy.

[28] D. Evans et al., Probability of second law violations in shearing steady states, 1993, Physical Review Letters.