Pointwise relations between information and estimation in the Poisson channel

Identities yielding optimal estimation interpretations for mutual information and relative entropy, paralleling those known for minimum mean-square error estimation under additive Gaussian noise, were recently discovered for the Poisson channel by Atar and Weissman. We express these identities as equalities between expectations of the associated estimation- and information-theoretic random variables, such as the actual estimation loss and the information density. By explicitly characterizing the relations between these random variables, we show that they are related in much stronger pointwise senses that directly imply the known expectation identities while deepening our understanding of them. As an example of the nature of our results, consider the equality between the mutual information and the mean cumulative filtering loss of the optimal filter in continuous-time estimation. We show that the difference between the information density and the cumulative filtering loss is a martingale expressible as a stochastic integral. This explicit characterization not only directly recovers the previously known expectation relation, but also allows us to characterize other distributional properties of the random variables involved, where some of the original objects of interest emerge in new and surprising roles. For example, we find that the increasing predictable part of the Doob-Meyer decomposition of the information density (which is a submartingale) is precisely the cumulative loss of the optimal filter.
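For concreteness, the following LaTeX sketch records the kind of identity being strengthened, in the standard continuous-time Poisson channel (filtering) setup. The notation is an assumption borrowed from the usual formulation of these results rather than quoted from the paper: X_t is the intensity (signal) process, \hat X_t = E[X_t | Y_0^t] the optimal causal estimate, and \ell the natural Poisson loss; the explicit form of the stochastic integrand is likewise a reconstruction under these assumptions.

% A minimal sketch, assuming the standard setup: Y is a doubly-stochastic
% Poisson (point) process whose intensity, given the signal, is X_t, and
% \hat X_t = E[X_t | Y_0^t] is the optimal causal (filtering) estimate.
\begin{align*}
  &\text{Expectation identity:}\quad
  I\big(X_0^T; Y_0^T\big)
    = \mathbb{E}\!\left[\int_0^T \ell\big(X_t,\hat X_t\big)\,\mathrm{d}t\right],
  \qquad
  \ell(x,\hat x) = x\log\frac{x}{\hat x} - x + \hat x, \\
  &\text{Pointwise strengthening:}\quad
  \underbrace{\log\frac{\mathrm{d}P_{Y_0^T\mid X_0^T}}{\mathrm{d}P_{Y_0^T}}\big(Y_0^T\big)}_{\text{information density}}
  \;-\;
  \underbrace{\int_0^T \ell\big(X_t,\hat X_t\big)\,\mathrm{d}t}_{\text{cumulative filtering loss}}
  \;=\;
  \int_0^T \log\frac{X_{t^-}}{\hat X_{t^-}}\,\big(\mathrm{d}Y_t - X_t\,\mathrm{d}t\big).
\end{align*}
% The right-hand side is a stochastic integral against the compensated point
% process dY_t - X_t dt, hence a zero-mean martingale: taking expectations
% collapses the pointwise relation to the expectation identity, and viewing the
% display as a process in the horizon T exhibits the cumulative loss as the
% increasing predictable (Doob-Meyer) part of the information density.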

[1]  A. J. Stam, Some inequalities satisfied by the quantities of information of Fisher and Shannon, Information and Control, 1959.

[2]  T. Duncan, On the calculation of mutual information, 1970.

[3]  P. Brémaud, Point Processes and Queues, 1981.

[4]  P. Brémaud, Point Processes and Queues: Martingale Dynamics, 1983.

[5]  A. Shiryayev et al., Statistics of Random Processes I: General Theory, 1984.

[6]  A. Barron, Entropy and the central limit theorem, 1986.

[7]  J. Grandell, Mixed Poisson Processes, 1997.

[8]  A. Shiryayev et al., Statistics of Random Processes II: Applications, 2000.

[9]  S. Verdú, Poisson communication theory, 2004.

[10]  S. Shamai et al., Mutual information and conditional mean estimation in Poisson channels, IEEE Information Theory Workshop (ITW), 2004.

[11]  S. Shamai et al., Mutual information and minimum mean-square error in Gaussian channels, IEEE Transactions on Information Theory, 2004.

[12]  M. Zakai, On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel, IEEE Transactions on Information Theory, 2004.

[13]  S. Verdú, Mismatched estimation and relative entropy, IEEE Transactions on Information Theory, 2009.

[14]  H. V. Poor et al., Channel coding rate in the finite blocklength regime, IEEE Transactions on Information Theory, 2010.

[15]  T. Weissman et al., The relationship between causal and noncausal mismatched estimation in continuous-time AWGN channels, IEEE Transactions on Information Theory, 2010.

[16]  T. Weissman, The relationship between causal and non-causal mismatched estimation in continuous-time AWGN channels, 2010 IEEE Information Theory Workshop (ITW 2010, Cairo), 2010.

[17]  T. Weissman et al., Pointwise relations between information and estimation in Gaussian noise, IEEE Transactions on Information Theory, 2012.

[18]  T. Weissman et al., Mutual information, relative entropy, and estimation in the Poisson channel, 2011 IEEE International Symposium on Information Theory Proceedings.

[19]  T. Matsuta et al., Report on an international conference: 2013 IEEE International Symposium on Information Theory, 2013.