Optimum estimation via partition functions and information measures

In continuation of a recent work on the statistical-mechanical analysis of optimum estimation in Gaussian noise via its relation to the mutual information (the I-MMSE relation), we propose a more direct relation between optimum estimation and certain information measures, which can be viewed as partition functions and are hence amenable to statistical-mechanical analysis. This approach has several advantages, most notably its applicability to general sources and channels, in contrast to the I-MMSE relation and its variants, which hold only for certain classes of channels. We also demonstrate the derivation of the optimum estimator and the MMSE in a few examples, one of which generalizes to a fairly wide class of sources and channels. For this class, our approach yields an approximate conditional mean estimator and an MMSE formula that has the flavor of a single-letter expression.
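To give the flavor of the partition-function viewpoint, the following is a minimal worked identity for the scalar Gaussian channel $Y = X + N$, $N \sim \mathcal{N}(0,\sigma^2)$; it is a standard relation (Tweedie's formula) consistent with the approach sketched above, not an excerpt from the paper itself.

```latex
% Partition function of the posterior: Z(y) is proportional to the
% marginal density of Y under the prior p(x).
\[
  Z(y) \;=\; \int p(x)\, e^{-(y-x)^2/(2\sigma^2)}\,\mathrm{d}x .
\]
% Differentiating \ln Z(y) with respect to y brings down the score
% (x-y)/\sigma^2, averaged over the posterior p(x\mid y):
\[
  \frac{\partial}{\partial y}\ln Z(y)
  \;=\;
  \frac{\int \frac{x-y}{\sigma^2}\, p(x)\, e^{-(y-x)^2/(2\sigma^2)}\,\mathrm{d}x}
       {\int p(x)\, e^{-(y-x)^2/(2\sigma^2)}\,\mathrm{d}x}
  \;=\; \frac{\mathbb{E}[X \mid Y=y] - y}{\sigma^2} ,
\]
% so the conditional mean estimator is a gradient of the log-partition
% function:
\[
  \hat{x}(y) \;=\; \mathbb{E}[X \mid Y=y]
  \;=\; y + \sigma^2\,\frac{\partial}{\partial y}\ln Z(y) .
\]
```

One more derivative gives the second-order analogue: since $\partial \mathbb{E}[X\mid Y=y]/\partial y = \mathrm{Var}(X\mid Y=y)/\sigma^2$, one finds $\mathrm{Var}(X \mid Y=y) = \sigma^2 + \sigma^4\,\partial^2 \ln Z(y)/\partial y^2$, and averaging over $Y$ yields the MMSE.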
