Some old and new relations between information and estimation

Elegant relationships between classical quantities in information and estimation arise in the presence of additive Gaussian noise. In this talk, we review these relationships, presenting them as expectations of random quantities, and then present some of our recent findings pertaining to their ‘pointwise’ analogues. Duncan showed in [1] that, for the continuous-time additive white Gaussian noise channel, the minimum mean squared filtering (causal estimation) error is twice the input-output mutual information, regardless of the underlying signal distribution. Another discovery was made by Guo et al. in [2], where the derivative of the mutual information with respect to the signal-to-noise ratio was found to equal half the minimum mean squared error of non-causal estimation. By combining these two intriguing results, the authors of [2] also established the remarkable equality between the causal mean squared error at a given signal-to-noise level snr and the non-causal error averaged over a signal-to-noise ratio distributed uniformly between 0 and snr.

These results have been extended to the presence of mismatch, in which case the relative entropy and the difference between the mismatched and matched mean squared errors are bridged: mismatched estimation in the scalar Gaussian channel was considered by Verdú in [3], while [4] provides a generalization of Duncan’s result to incorporate mismatch for continuous-time processes in full generality. In [5], Kadota et al. generalize Duncan’s theorem to the presence of feedback.

In [6], the pointwise analogues of these relationships are obtained, giving considerable insight into the above results. As an illustration, consider Duncan’s 1970 result, which can equivalently be expressed by saying that the difference between the input-output information density and half the causal estimation error is a zero-mean random variable, regardless of the distribution of the channel input. We characterize this random variable explicitly, rather than merely its expectation. Classical estimation- and information-theoretic quantities emerge in new and surprising roles. For example, the variance of this random variable turns out to be given by the causal MMSE (which, in turn, equals twice the mutual information by Duncan’s result).
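
For concreteness, the classical relations can be recalled in the following standard form (the notation below is illustrative, and a unit signal-to-noise normalization is assumed in the continuous-time statement). For the scalar channel $Y_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\,X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$, the relation of [2] reads

\[
\frac{d}{d\,\mathrm{snr}}\, I(X; Y_{\mathrm{snr}}) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\big[(X - \mathbb{E}[X \mid Y_{\mathrm{snr}}])^2\big],
\]

while for the continuous-time channel $dY_t = X_t\,dt + dW_t$, $0 \le t \le T$, with causal estimate $\hat{X}_t = \mathbb{E}[X_t \mid Y_0^t]$, Duncan’s theorem [1] states

\[
I(X_0^T; Y_0^T) \;=\; \frac{1}{2}\int_0^T \mathbb{E}\big[(X_t - \hat{X}_t)^2\big]\,dt .
\]

Combining the two yields the causal/non-causal relation of [2], $\mathrm{cmmse}(\mathrm{snr}) = \frac{1}{\mathrm{snr}}\int_0^{\mathrm{snr}} \mathrm{mmse}(\gamma)\,d\gamma$, where cmmse and mmse denote the (time-averaged) causal and non-causal errors at the indicated signal-to-noise level.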
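
In the mismatched setting, the true input law is $P$ while the estimator is designed for a law $Q$. In one standard form of the scalar result of [3] (again in illustrative notation), the relative entropy between the corresponding output distributions is half the integrated excess mean squared error,

\[
D\big(P_{Y_{\mathrm{snr}}} \,\big\|\, Q_{Y_{\mathrm{snr}}}\big) \;=\; \frac{1}{2}\int_0^{\mathrm{snr}} \big[\mathrm{mse}_Q(\gamma) - \mathrm{mmse}_P(\gamma)\big]\,d\gamma,
\]

where $\mathrm{mse}_Q(\gamma)$ is the mean squared error incurred under $P$ by the estimator that is optimal for $Q$ at signal-to-noise ratio $\gamma$; [4] establishes the analogous causal statement for general continuous-time processes.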
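
The pointwise statement behind this illustration admits a brief sketch via a standard likelihood-ratio computation for the continuous-time channel at unit signal-to-noise ratio (notation as above). With the information density $i(X_0^T; Y_0^T) = \log \frac{dP_{Y \mid X}}{dP_Y}(Y_0^T \mid X_0^T)$, one obtains

\[
i(X_0^T; Y_0^T) \;-\; \frac{1}{2}\int_0^T (X_t - \hat{X}_t)^2\,dt \;=\; \int_0^T (X_t - \hat{X}_t)\,dW_t ,
\]

a stochastic integral against the channel noise: its expectation is zero, recovering Duncan’s theorem, and by the Itô isometry its variance equals $\int_0^T \mathbb{E}\big[(X_t - \hat{X}_t)^2\big]\,dt$, the integrated causal MMSE, i.e. twice the mutual information.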