Additive non-Gaussian noise channels: mutual information and conditional mean estimation

It has recently been shown that the derivative of the input-output mutual information of Gaussian noise channels with respect to the signal-to-noise ratio is equal to the minimum mean-square error. This paper considers general additive noise channels where the noise need not be Gaussian. It is found that, for every fixed input distribution, the derivative of the mutual information with respect to the signal strength is equal to the correlation of two conditional mean estimates, associated with the input and the noise respectively. Specialized versions of the result are given for additive exponential, Cauchy, Laplace, and Rayleigh noise. The previous result on Gaussian noise channels is also recovered as a special case.
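
To make the statement concrete, one way to write the general identity (the display below is a reconstruction consistent with the abstract, in nats and with a square-root amplitude normalization chosen here, not quoted from the paper) is, for $Y=\sqrt{\mathsf{snr}}\,X+N$ with noise $N$ independent of the input $X$:

\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I\bigl(X;\sqrt{\mathsf{snr}}\,X+N\bigr)
= \frac{1}{2\sqrt{\mathsf{snr}}}\,
\mathbb{E}\bigl\{\mathbb{E}[X\mid Y]\,\mathbb{E}[N\mid Y]\bigr\}.
\]

When $N\sim\mathcal{N}(0,1)$, we have $\mathbb{E}[N\mid Y]=Y-\sqrt{\mathsf{snr}}\,\mathbb{E}[X\mid Y]$, so the right-hand side reduces to $\tfrac{1}{2}\,\mathrm{mmse}(\mathsf{snr})$, recovering the Gaussian-channel result recalled in the first sentence.

The identity is easy to probe numerically. The sketch below is hypothetical code, not from the paper: it uses the equivalent amplitude parametrization $Y=aX+N$ with $a=\sqrt{\mathsf{snr}}$, under which the reconstructed identity reads $\mathrm{d}I/\mathrm{d}a=\mathbb{E}\{\mathbb{E}[X\mid Y]\,\mathbb{E}[N\mid Y]\}$, and compares a finite-difference derivative of the mutual information against the correlation of the two conditional means. The standard Gaussian input, unit Laplace noise, and grid sizes are illustrative choices made here.

```python
import numpy as np
from scipy.integrate import trapezoid

# Hypothetical sanity check (not from the paper): with Y = a*X + N and
# a = sqrt(snr), the reconstructed identity reads
#   dI/da = E[ E[X|Y] * E[N|Y] ].
# Here X ~ N(0,1) and N ~ Laplace(0,1); all densities live on fixed grids.

x = np.linspace(-8.0, 8.0, 1601)                    # input grid
y = np.linspace(-14.0, 14.0, 2401)                  # output grid
px = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)         # standard Gaussian input density
pn = lambda z: 0.5 * np.exp(-np.abs(z))             # unit Laplace noise density
h_noise = 1.0 + np.log(2.0)                         # h(N) for Laplace(0,1), in nats

def posterior(a):
    """Return p_Y on the y-grid and E[X|Y=y] for the channel Y = a*X + N."""
    pyx = pn(y[:, None] - a * x[None, :])           # p(y|x) on the (y, x) grid
    py = trapezoid(pyx * px, x, axis=1)             # marginal p_Y(y)
    xhat = trapezoid(x * pyx * px, x, axis=1) / py  # conditional mean E[X|Y=y]
    return py, xhat

def mutual_info(a):
    """I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(N), with h(Y) by numerical integration."""
    py, _ = posterior(a)
    return -trapezoid(py * np.log(py), y) - h_noise

a, da = 1.0, 1e-3
lhs = (mutual_info(a + da) - mutual_info(a - da)) / (2 * da)   # dI/da

py, xhat = posterior(a)
nhat = y - a * xhat                                 # E[N|Y] = Y - a*E[X|Y]
rhs = trapezoid(py * xhat * nhat, y)                # E[ E[X|Y] E[N|Y] ]

print(f"dI/da via finite differences   : {lhs:.5f}")
print(f"E[ E[X|Y] E[N|Y] ] correlation : {rhs:.5f}")  # should nearly coincide
```

Substituting another noise density whose differential entropy is known in closed form lets the same script exercise further cases of the kind listed in the abstract; the Laplace choice above covers one of them directly.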
