Certain Relations between Mutual Information and Fidelity of Statistical Estimation

I present several new relations between mutual information (MI) and statistical estimation error for a system that can be regarded simultaneously as a communication channel and as an estimator of an input parameter. I first derive a second-order relation between MI and Fisher information (FI) that is valid for sufficiently narrow priors but arbitrary channels. A second relation furnishes a lower bound on the MI in terms of the minimum mean-squared error (MMSE) of the Bayesian estimation of the input parameter from the channel output, a bound valid for arbitrary channels and priors. This lower bound extends previous work relating the MI to the FI, which holds only in the asymptotic and high-SNR limits, and further elucidates the fundamental connection between information-theoretic and estimation-theoretic measures of fidelity. The remaining relations I present are inequalities and correspondences among MI, FI, and MMSE in the presence of nuisance parameters.
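As background for the MI–MMSE connection discussed above, the classical scalar Gaussian channel admits closed-form expressions for both quantities, and the well-known I-MMSE identity dI/dsnr = MMSE/2 (in nats) links them exactly. The sketch below is an illustration of that classical Gaussian case only, not of the paper's new bounds; the channel model Y = sqrt(snr)·X + N with X, N standard normal is an assumption made for the example.

```python
import math

# Scalar Gaussian channel: Y = sqrt(snr) * X + N, with X ~ N(0,1), N ~ N(0,1).
# Closed-form expressions for this classical case:

def mutual_information(snr):
    """I(X; Y) in nats for the scalar Gaussian channel."""
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    """MMSE of the Bayesian (conditional-mean) estimate of X from Y."""
    return 1.0 / (1.0 + snr)

# Numerically verify the I-MMSE identity dI/dsnr = mmse(snr) / 2
# via central finite differences at a few SNR values.
for snr in (0.5, 1.0, 4.0):
    h = 1e-6
    dI = (mutual_information(snr + h) - mutual_information(snr - h)) / (2 * h)
    assert abs(dI - 0.5 * mmse(snr)) < 1e-6, (snr, dI)
```

Here the MMSE decays like 1/snr while the MI grows logarithmically, which is one concrete instance of the trade-off between information and estimation fidelity that the abstract's general bounds quantify for arbitrary channels and priors.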