What we learn from the learning rate

The learning rate is an information-theoretic quantity for bipartite Markov chains describing two coupled subsystems. It is defined as the rate at which transitions in the downstream subsystem tend to increase the mutual information between the two subsystems, and it is bounded by the dissipation arising from these transitions. Its physical interpretation, however, is unclear, although it has been used as a metric for the sensing performance of the downstream subsystem. In this paper, we explore the behaviour of the learning rate for a number of simple model systems, establishing when and how its behaviour is distinct from the instantaneous mutual information between subsystems. In the simplest case, the two are almost equivalent. In more complex steady-state systems, the mutual information and the learning rate behave qualitatively differently, with the learning rate now clearly reflecting the rate at which the downstream system must update its information in response to changes in the upstream system. It is not clear whether this quantity is the most natural measure of sensor performance, and, indeed, we provide an example in which optimising the learning rate over a region of parameter space of the downstream system yields an apparently sub-optimal sensor.
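
For concreteness, a minimal sketch of the learning rate as it is commonly defined in the bipartite stochastic-thermodynamics literature; the notation below (the joint distribution p(x, y) and the downstream transition rates w^x_{y' \to y}) is assumed here rather than taken from the abstract, and the paper's own models may use different conventions. For a bipartite Markov chain with upstream state x and downstream state y, the learning rate is the contribution of y-transitions to the rate of change of the mutual information,
\[
  l_Y \;=\; \sum_{x}\sum_{y' \neq y} p(x, y')\, w^{x}_{y' \to y}\, \ln \frac{p(x \mid y)}{p(x \mid y')},
\]
and the second law applied to the downstream subsystem bounds it by the entropy production (dissipation) associated with those transitions,
\[
  l_Y \;\leq\; \dot{\sigma}_Y .
\]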
