A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the information-processing second law (IPSL): the physical entropy of the Universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? We identify a minimal, and thus inescapable, transient dissipation in physical information processing that is not captured by asymptotic rates but is critical to adaptive thermodynamic processes, such as those found in biological systems. As a component of this transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information-processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that "retrodictive" generators achieve the minimal costs. The results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.
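For concreteness, a minimal sketch of the asymptotic-rate IPSL in the notation common to this literature (the symbols are assumed here, not defined in the abstract: $h_\mu$ and $h'_\mu$ denote the Shannon entropy rates of the input and output symbol sequences, $\langle W \rangle$ the average work extracted per symbol at reservoir temperature $T$):

$$
\langle W \rangle \;\le\; k_B T \ln 2 \,\bigl( h'_\mu - h_\mu \bigr) ,
$$

so, on average, the thermodynamic entropy of the Universe can decrease only when compensated by a net increase $h'_\mu - h_\mu > 0$ in the entropy rate of the information-carrying degrees of freedom.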