Extending Landauer's Bound from Bit Erasure to Arbitrary Computation

Recent analyses have calculated the minimal thermodynamic work required to perform a computation pi when two conditions hold: (i) the output of pi is independent of its input (as in bit erasure), and (ii) the physical computer C implementing pi is specially tailored to C's environment, i.e., to the precise distribution P_0 over C's inputs. First I extend these analyses to calculate the required work even if the output of pi depends on its input, and even if C is not used with the distribution P_0 it was tailored for. Next I show that if C will be re-used, then the minimal work to run it depends only on the logical computation pi, independent of the physical details of C. This establishes a formal identity between the thermodynamics of (re-usable) computers and theoretical computer science. I use this identity to prove that the minimal work required to compute a bit string sigma on a "general-purpose computer" rather than a special-purpose one, i.e., on a universal Turing machine U, is k_B T ln(2) times the sum of three terms: the Kolmogorov complexity of sigma, the log of the Bernoulli measure of the set of input strings that compute sigma, and the log of the halting probability of U. I also prove that using C with a distribution over environments results in an unavoidable increase in the work required to run the computer, even if C is tailored to that distribution over environments. I end by using these results to relate the free energy flux incident on an organism / robot / biosphere to the maximal amount of computation it can do per unit time.
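As a numerical anchor for the bounds discussed above: Landauer's principle sets the minimal work to erase one bit at temperature T to k_B T ln(2). The sketch below computes this textbook per-bit cost only (it is not the paper's full three-term bound for a universal Turing machine, which additionally involves Kolmogorov complexity and the halting probability); the function name and structure are illustrative.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def landauer_work(n_bits: float, temperature: float) -> float:
    """Minimal work (in joules) to erase n_bits at the given temperature (K),
    per Landauer's bound W >= n * k_B * T * ln(2)."""
    return n_bits * K_B * temperature * math.log(2)

# Erasing a single bit at room temperature (300 K) costs on the order of 3e-21 J:
w = landauer_work(1, 300.0)
print(f"{w:.3e} J")
```

The bound scales linearly in both the number of bits erased and the temperature, which is why the closing remark above can translate a free energy flux into a maximal computation rate at fixed T.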
