Continuity of mutual entropy in the large signal-to-noise ratio limit

This article addresses the proof of the entropy power inequality (EPI) proposed by Shannon, an important tool in the analysis of Gaussian channels of information transmission. We analyse continuity properties of the mutual entropy of the input and output signals of an additive memoryless channel and discuss assumptions under which the EPI holds true.
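For orientation, the quantities involved can be written in their standard form; the notation below is the conventional one and is not drawn from the article itself. The block states the entropy power of a signal X with differential entropy h(X), the EPI for independent summands, and the mutual entropy of the input and output of an additive channel Y = X + Z.

```latex
% Entropy power of a (scalar) signal X with differential entropy h(X):
\[
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)} .
\]
% Entropy power inequality for independent X and Y
% (equality holds iff X and Y are Gaussian):
\[
  N(X + Y) \;\ge\; N(X) + N(Y) .
\]
% Mutual entropy of the input and output of the additive channel
% Y = X + Z, with noise Z independent of the input X:
\[
  I(X;\, X + Z) \;=\; h(X + Z) - h(Z) .
\]
```

The continuity question the article studies concerns the behaviour of the last quantity, I(X; X + Z), as the noise contribution becomes small relative to the signal, i.e. in the large signal-to-noise ratio limit.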
