Information and estimation in Fokker-Planck channels

We study the relationship between information- and estimation-theoretic quantities in time-evolving systems. Focusing on the Fokker-Planck channel defined by a general stochastic differential equation, we show that the time derivatives of entropy, KL divergence, and mutual information are characterized by estimation-theoretic quantities involving an appropriate generalization of the Fisher information. Our results substantially generalize De Bruijn's identity and the classical I-MMSE relation.
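Both classical special cases named in the abstract admit closed forms for Gaussian inputs, which makes them easy to verify numerically. The sketch below is an illustration only (not code from the paper; the function names are ours): it checks De Bruijn's identity d/dt h(Y_t) = (1/2) J(Y_t) along the heat flow, and the I-MMSE relation d/d(snr) I(X; Y) = (1/2) mmse(snr), by comparing finite differences with the known Gaussian formulas.

```python
import math

# --- De Bruijn's identity, Gaussian special case ---
# Y_t = X + sqrt(t) * Z with X ~ N(0, sigma2), Z ~ N(0, 1),
# so Y_t ~ N(0, sigma2 + t).

def entropy_gaussian(var):
    """Differential entropy of N(0, var): 0.5 * ln(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def fisher_gaussian(var):
    """Fisher information (w.r.t. location) of N(0, var): 1/var."""
    return 1.0 / var

sigma2, t, dt = 1.0, 0.5, 1e-6
var = sigma2 + t
# Central-difference derivative of entropy along the heat flow
dh_dt = (entropy_gaussian(var + dt) - entropy_gaussian(var - dt)) / (2 * dt)
assert abs(dh_dt - 0.5 * fisher_gaussian(var)) < 1e-6  # de Bruijn holds

# --- I-MMSE relation, Gaussian special case ---
# Y = sqrt(snr) * X + Z with X, Z ~ N(0, 1), for which
# I(X; Y) = 0.5 * ln(1 + snr) and mmse(snr) = 1 / (1 + snr).

def mi_gaussian(snr):
    """Mutual information of the scalar Gaussian channel at a given SNR."""
    return 0.5 * math.log(1.0 + snr)

def mmse_gaussian(snr):
    """Minimum mean-square error of estimating X from Y at a given SNR."""
    return 1.0 / (1.0 + snr)

snr, ds = 2.0, 1e-6
dI_dsnr = (mi_gaussian(snr + ds) - mi_gaussian(snr - ds)) / (2 * ds)
assert abs(dI_dsnr - 0.5 * mmse_gaussian(snr)) < 1e-6  # I-MMSE holds
print("de Bruijn and I-MMSE checks pass for Gaussian inputs")
```

The Gaussian case is the one setting where both sides of each identity are available in closed form; the paper's contribution is precisely to extend such derivative identities beyond this additive-Gaussian setting to general Fokker-Planck dynamics.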
