The Relationship Between Causal and Noncausal Mismatched Estimation in Continuous-Time AWGN Channels

A continuous-time finite-power process with distribution P is observed through an AWGN channel at a given signal-to-noise ratio (SNR), and is estimated by an estimator that would have minimized the mean-square error if the process had distribution Q. We show that the causal filtering mean-square error (MSE) achieved at SNR level snr is equal to the average of the noncausal (smoothing) MSE achieved over a channel whose SNR is distributed uniformly between 0 and snr. Emerging as the bridge for equating these two quantities are mutual information and relative entropy. Our result generalizes that of Guo, Shamai and Verdú (2005) from the non-mismatched case, where P = Q, to general P and Q. Among our intermediate results is an extension of Duncan's theorem, which relates mutual information and causal MMSE, to the case of mismatched estimation. Some further extensions and implications are discussed. Key to our findings is the recent result of Verdú on mismatched estimation and relative entropy.
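
In notation introduced here only for illustration (cmse_{P,Q} and mse_{P,Q} denote the time-averaged causal and noncausal MSEs when the process has law P but the estimator is optimal for Q; the paper's own symbols may differ), the identity described above can be written as

\[
  \mathrm{cmse}_{P,Q}(\mathsf{snr}) \;=\; \frac{1}{\mathsf{snr}} \int_{0}^{\mathsf{snr}} \mathrm{mse}_{P,Q}(\gamma)\, \mathrm{d}\gamma ,
\]

which reduces to the relation of Guo, Shamai and Verdú when P = Q. The relative-entropy bridge invoked above is the result of Verdú [34], which, as we read it, takes the following form in the same illustrative notation:

\[
  D\big(P_{Y^{\mathsf{snr}}} \,\big\|\, Q_{Y^{\mathsf{snr}}}\big) \;=\; \frac{1}{2} \int_{0}^{\mathsf{snr}} \big[ \mathrm{mse}_{P,Q}(\gamma) - \mathrm{mse}_{P,P}(\gamma) \big]\, \mathrm{d}\gamma ,
\]

where P_{Y^snr} and Q_{Y^snr} denote the channel output distributions induced by P and Q, respectively, at SNR level snr.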

[1]  E. L. Lehmann, et al., Theory of point estimation, 1950.

[2]  Andrei N. Kolmogorov, et al., On the Shannon theory of information transmission in the case of continuous signals, 1956, IRE Trans. Inf. Theory.

[3]  A. J. Stam, Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Inf. Control.

[4]  I. V. Girsanov, On Transforming a Certain Class of Stochastic Processes by Absolutely Continuous Substitution of Measures, 1960.

[5]  Amiel Feinstein, et al., Information and information stability of random variables and processes, 1964.

[6]  Tyrone E. Duncan, et al., Evaluation of Likelihood Functions, 1968, Inf. Control.

[7]  T. Duncan, On the Calculation of Mutual Information, 1970.

[8]  Jacob Ziv, et al., Mutual information of the white Gaussian channel with and without feedback, 1971, IEEE Trans. Inf. Theory.

[9]  T. Kailath, The Structure of Radon-Nikodym Derivatives with Respect to Wiener and Related Measures, 1971.

[10]  T. T. Kadota, et al., Capacity of a continuous memoryless channel with feedback, 1971, IEEE Trans. Inf. Theory.

[11]  Steven Orey, et al., Conditions for the absolute continuity of two diffusions, 1974.

[12]  Albert N. Shiryaev, et al., Statistics of random processes, 1977.

[13]  Aaron D. Wyner, et al., A Definition of Conditional Mutual Information for Arbitrary Ensembles, 1978, Inf. Control.

[14]  A. Barron, Entropy and the Central Limit Theorem, 1986.

[15]  Ioannis Karatzas, et al., Brownian Motion and Stochastic Calculus, 1987.

[16]  Neri Merhav, et al., A strong version of the redundancy-capacity theorem of universal coding, 1995, IEEE Trans. Inf. Theory.

[17]  Amir Dembo, et al., Large Deviations Techniques and Applications, 1998.

[18]  Shlomo Shamai, et al., Mutual Information and Conditional Mean Estimation in Poisson Channels, 2004, IEEE Transactions on Information Theory.

[19]  K. Ball, et al., Solution of Shannon's problem on the monotonicity of entropy, 2004.

[20]  Daniel Pérez Palomar, et al., Gradient of mutual information in linear vector Gaussian channels, 2006, IEEE Transactions on Information Theory.

[21]  Shlomo Shamai, et al., Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.

[22]  Moshe Zakai, et al., On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel, 2004, IEEE Transactions on Information Theory.

[23]  Antonia Maria Tulino, et al., Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof, 2006, IEEE Transactions on Information Theory.

[24]  Antonia Maria Tulino, et al., Optimum power allocation for parallel Gaussian channels with arbitrary input distributions, 2006, IEEE Transactions on Information Theory.

[25]  Jacob Binia, Divergence and minimum mean-square error in continuous-time additive white Gaussian noise channels, 2006, IEEE Transactions on Information Theory.

[26]  Sergio Verdú, et al., A simple proof of the entropy-power inequality, 2006, IEEE Transactions on Information Theory.

[27]  Moshe Zakai, et al., Some relations between mutual information and estimation error in Wiener space, 2006.

[28]  Gregoire Nicolis, et al., Stochastic resonance, 2007, Scholarpedia.

[29]  Daniel Pérez Palomar, et al., Representation of Mutual Information Via Input Estimates, 2007, IEEE Transactions on Information Theory.

[30]  Mokshay M. Madiman, et al., Generalized Entropy Power Inequalities and Monotonicity Properties of Information, 2006, IEEE Transactions on Information Theory.

[31]  Tsachy Weissman, et al., Scanning and Sequential Decision Making for Multidimensional Data—Part II: The Noisy Case, 2008, IEEE Transactions on Information Theory.

[32]  Dongning Guo, et al., Relative entropy and score function: New information-estimation relationships through arbitrary additive perturbation, 2009, IEEE International Symposium on Information Theory.

[33]  Haim H. Permuter, et al., Directed information and causal estimation in continuous time, 2009, IEEE International Symposium on Information Theory.

[34]  Sergio Verdú, et al., Mismatched Estimation and Relative Entropy, 2009, IEEE Transactions on Information Theory.

[35]  Tyrone E. Duncan, Mutual Information for Stochastic Signals and Lévy Processes, 2010, IEEE Transactions on Information Theory.

[36]  Tsachy Weissman, et al., The Relationship Between Causal and Noncausal Mismatched Estimation in Continuous-Time AWGN Channels, 2010, IEEE Transactions on Information Theory.

[37]  Imre Csiszár, et al., Information Theory - Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.