Prediction and Power in Molecular Sensors: Uncertainty and Dissipation When Conditionally Markovian Channels Are Driven by Semi-Markov Environments

Sensors often serve at least two purposes: predicting their input and minimizing dissipated heat. However, determining whether a particular sensor has evolved or been designed to be accurate and efficient is difficult. This arises partly because these two objectives are at cross purposes and partly because quantifying the predictive performance of even in silico sensors can require prohibitively long simulations. To circumvent these difficulties, we develop expressions for the predictive accuracy and thermodynamic costs of the broad class of conditionally Markovian sensors subject to unifilar hidden semi-Markov (memoryful) environmental inputs. Predictive metrics include the instantaneous memory and the mutual information between the present sensor state and the input's future, while dissipative metrics include power consumption and the nonpredictive information rate. Success in deriving these formulae relies heavily on identifying the environment's causal states, the input's minimal sufficient statistics for prediction. Using these formulae, we study the simplest nontrivial biological sensor model: that of a Hill molecule, characterized by the number of ligands that bind simultaneously, the sensor's cooperativity. When energetic rewards are proportional to total predictable information, the cooperativity that locally optimizes the total energy budget generally depends hysteretically on the environment's past. In this way, the sensor gains robustness to environmental fluctuations. Given the simplicity of the Hill molecule, such hysteresis will likely be found in more complex predictive sensors as well. That is, adaptations that only locally optimize biochemical parameters for prediction and dissipation can lead to sensors that "remember" the past environment.
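As a brute-force complement to the closed-form expressions described above, the sketch below estimates one of the predictive metrics, the instantaneous memory, by directly simulating a two-state semi-Markov environment (gamma-distributed dwell times make it memoryful) driving a two-state Hill-molecule sensor whose binding rate depends only on the current ligand level. This is a minimal illustrative sketch, not the paper's model: the rates, dwell-time parameters, and two-state reduction are all assumptions, and the long simulation time it needs is exactly the cost the derived formulae avoid.

```python
# Minimal sketch (illustrative parameters, not the paper's): estimate the
# instantaneous memory I[X_t; S_t] between a two-state semi-Markov environment
# X_t and a two-state Hill-molecule sensor S_t by direct simulation.
import numpy as np

rng = np.random.default_rng(0)

# Environment: alternates between low and high ligand concentration, with
# gamma-distributed dwell times (semi-Markov, hence memoryful).
concentrations = np.array([0.5, 2.0])   # assumed ligand levels per state
dwell_shape, dwell_scale = 3.0, 1.0     # assumed gamma dwell-time parameters

# Sensor: unbound (0) <-> bound (1). Binding follows a Hill form with
# cooperativity n; unbinding is constant-rate.
n = 2            # Hill coefficient (cooperativity)
k_off = 1.0

def k_on(c):
    """Hill-type binding propensity at ligand concentration c."""
    return c**n / (1.0 + c**n)

dt, T = 0.01, 1e4
steps = int(T / dt)
x, s = 0, 0                              # environment state, sensor state
time_to_switch = rng.gamma(dwell_shape, dwell_scale)
counts = np.zeros((2, 2))                # joint occupancy of (X_t, S_t)

for _ in range(steps):
    time_to_switch -= dt
    if time_to_switch <= 0:              # environment switches; redraw dwell time
        x = 1 - x
        time_to_switch = rng.gamma(dwell_shape, dwell_scale)
    # Conditionally Markovian sensor: rates depend only on the current input.
    rate = k_on(concentrations[x]) if s == 0 else k_off
    if rng.random() < rate * dt:
        s = 1 - s
    counts[x, s] += 1

# Instantaneous memory: mutual information of the stationary joint distribution.
p = counts / counts.sum()
px, ps = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
nz = p > 0
I_mem = (p[nz] * np.log2(p[nz] / (px @ ps)[nz])).sum()
print(f"instantaneous memory ≈ {I_mem:.4f} bits")
```

Estimating the mutual information between the present sensor state and the input's *future* the same way would require binning entire future trajectories, which is where direct simulation becomes prohibitive and the causal-state construction pays off.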