Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels

Fundamental relations between information and estimation have been established in the literature for the continuous-time Gaussian and Poisson channels. In this paper, we demonstrate that such relations hold for a much larger family of continuous-time channels. We introduce the family of semi-martingale channels, in which the channel output is a semi-martingale stochastic process and the channel input modulates the characteristics of the semi-martingale. For these channels, which include the continuous-time Gaussian and Poisson models as special cases, we establish new representations relating the mutual information between the channel input and output to an optimal causal filtering loss, thereby unifying and considerably extending results from the Gaussian and Poisson settings. We also present extensions to the setting of mismatched estimation, where the relative entropy between the laws governing the channel output under two different input distributions equals the cumulative difference between the estimation losses incurred by the mismatched and optimal causal filters, respectively. The main tool underlying these results is the Doob–Meyer decomposition of a class of sub-martingales. The results in this paper can be viewed as the continuous-time analogues of recent generalizations of relations between information and estimation for discrete-time Lévy channels.
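For concreteness, the classical Gaussian special case of the relations described above can be sketched as follows (this is the well-known result of Duncan and its mismatched extension, not a statement taken from this paper): for the continuous-time AWGN channel $dY_t = X_t\,dt + dW_t$ with standard Brownian motion $W$, mutual information and relative entropy reduce to cumulative causal estimation losses.

```latex
% Duncan's theorem (AWGN channel, unit SNR): mutual information equals
% half the time-integrated causal minimum mean-square error.
I\big(X^T; Y^T\big)
  = \frac{1}{2}\int_0^T
    \mathbb{E}\!\left[\big(X_t - \mathbb{E}[X_t \mid Y^t]\big)^2\right] dt .

% Mismatched extension: if the true input law is P but the causal filter
% \hat{X}^Q is designed for the law Q, the relative entropy between the
% induced output laws equals half the cumulative excess estimation loss.
D\big(P_{Y^T} \,\big\|\, Q_{Y^T}\big)
  = \frac{1}{2}\int_0^T
    \mathbb{E}\!\left[\big(X_t - \hat{X}^Q_t\big)^2
                    - \big(X_t - \hat{X}^P_t\big)^2\right] dt .
```

The semi-martingale framework of the paper recovers identities of this form, with the quadratic loss replaced by a loss function matched to the channel (e.g., a logarithmic loss in the Poisson case).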
