Continuity of Mutual Entropy in the Limiting Signal-To-Noise Ratio Regimes

This article addresses the proof of the entropy-power inequality (EPI), proposed by Shannon, which is an important tool in the analysis of Gaussian information-transmission channels. We analyse continuity properties of the mutual entropy of the input and output signals in an additive memoryless channel and discuss the assumptions under which the EPI holds.
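As a brief illustrative sketch (not taken from the article), recall the standard statement of the EPI: for independent continuous random variables X and Y, the entropy powers satisfy N(X+Y) ≥ N(X) + N(Y), where N(X) = e^{2h(X)}/(2πe) and h is differential entropy in nats. Independent Gaussians attain equality, which the following minimal numerical check confirms:

```python
import math

def gaussian_entropy(var):
    # Differential entropy (in nats) of a Gaussian with variance `var`:
    # h(X) = 0.5 * ln(2 * pi * e * var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_power(h):
    # Entropy power N = exp(2h) / (2 * pi * e)
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Independent Gaussians X ~ N(0, 2) and Y ~ N(0, 3), so X + Y ~ N(0, 5).
n_x = entropy_power(gaussian_entropy(2.0))
n_y = entropy_power(gaussian_entropy(3.0))
n_sum = entropy_power(gaussian_entropy(5.0))

# EPI: N(X + Y) >= N(X) + N(Y); Gaussian inputs attain equality,
# since for a Gaussian the entropy power equals the variance.
assert n_sum >= n_x + n_y - 1e-12
assert abs(n_sum - (n_x + n_y)) < 1e-9
```

For a Gaussian, the entropy power reduces exactly to the variance, which is why the equality case is immediate here; non-Gaussian summands generally give a strict inequality.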
