Above and Beyond the Landauer Bound: Thermodynamics of Modularity

Information processing typically occurs via the composition of modular units, such as universal logic gates. The benefit of modular information processing, in contrast to globally integrated information processing, is that complex global computations are more easily and flexibly implemented via a series of simpler, localized information processing operations that control and change only local degrees of freedom. We show that, despite these benefits, there are unavoidable thermodynamic costs to modularity—costs that arise directly from the operation of localized processing and that go beyond Landauer's dissipation bound for erasing information. Integrated computations can achieve Landauer's bound, however, when they globally coordinate the control of all of an information reservoir's degrees of freedom. Unfortunately, global correlations among the information-bearing degrees of freedom are easily lost by modular implementations. This is costly, since such correlations are a thermodynamic fuel. We quantify the minimum irretrievable dissipation of modular computations in terms of the difference between the change in global nonequilibrium free energy, which captures these global correlations, and the local (marginal) change in nonequilibrium free energy, which bounds modular work production. This modularity dissipation is proportional to the amount of additional work required to perform the computational task modularly. It has immediate consequences for physically embedded transducers, known as information ratchets. We show how to circumvent modularity dissipation by designing internal ratchet states that capture the global correlations and patterns in the ratchet's information reservoir. Designed in this way, information ratchets match the optimum thermodynamic efficiency of globally integrated computations. Moreover, for ratchets that extract work from a structured pattern, minimized modularity dissipation means their hidden states must be predictive of their input; for ratchets that generate a structured pattern, it means their hidden states must be retrodictive.
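The gap between the global and marginal free-energy changes can be made concrete with a toy calculation. The sketch below, a minimal illustration and not the paper's formalism, considers a two-bit information reservoir whose bits are perfectly correlated: a globally integrated eraser pays only for the joint entropy change, while a modular, bit-by-bit eraser pays for the sum of marginal entropy changes, and the difference is exactly the mutual information lost by acting locally. All names and the specific distribution are assumptions for illustration.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two perfectly correlated bits: p(00) = p(11) = 1/2 (illustrative choice).
p_joint = [0.5, 0.5]        # probabilities of the states 00 and 11
p_marginal = [0.5, 0.5]     # each bit, viewed in isolation, is a fair coin

H_joint = shannon_entropy(p_joint)       # joint entropy of the pair: 1 bit
H_marginal = shannon_entropy(p_marginal) # marginal entropy per bit: 1 bit

# Mutual information between the two bits -- the global correlation
# that a modular implementation cannot see.
I_mutual = 2 * H_marginal - H_joint

# Minimum work to erase, in units of k_B T ln 2 per bit of entropy change:
# a global eraser pays for the joint entropy; a modular eraser, resetting
# each bit separately, pays for both marginal entropies.
work_global = H_joint
work_modular = 2 * H_marginal

modularity_dissipation = work_modular - work_global

print(f"global minimum work:    {work_global:.1f} kT ln2")
print(f"modular minimum work:   {work_modular:.1f} kT ln2")
print(f"modularity dissipation: {modularity_dissipation:.1f} kT ln2")
```

Here the modular eraser dissipates one extra k_B T ln 2, equal to the one bit of mutual information between the two bits, illustrating the abstract's claim that correlations lost by local operations are irretrievable thermodynamic fuel.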
