Proof of Entropy Power Inequalities Via MMSE

The differential entropy of a random variable (or vector) can be expressed as an integral, over the signal-to-noise ratio (SNR), of the minimum mean-square error (MMSE) of estimating the variable (or vector) when it is observed in additive Gaussian noise. This representation sidesteps Fisher information and yields simple, insightful proofs of Shannon's entropy power inequality (EPI) and two of its variations: Costa's strengthened EPI for the case in which one of the variables is Gaussian, and the generalized EPI for linear transformations of a random vector due to Zamir and Feder.
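The identity behind these proofs is the I-MMSE relation of Guo, Shamai, and Verdú [6]: for the channel Y = sqrt(snr) X + N with standard Gaussian noise N, the input-output mutual information satisfies

I(snr) = (1/2) ∫_0^snr mmse(γ) dγ,

where mmse(γ) is the minimum mean-square error of estimating X from sqrt(γ) X + N. The Python script below is a minimal numerical sketch of this identity, not code from the paper; it assumes NumPy and SciPy and uses the standard closed-form MMSE and mutual-information expressions for a Gaussian input and for an equiprobable binary input.

import numpy as np
from scipy.integrate import quad

# Standard-normal density, used in the quadrature formulas below.
phi = lambda y: np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)

def mmse_gaussian(g, var=1.0):
    # MMSE for a Gaussian input of variance `var` at SNR g: var / (1 + var * g).
    return var / (1.0 + var * g)

def mmse_bpsk(g):
    # MMSE for equiprobable X in {-1, +1}: 1 - E[tanh(g - sqrt(g) * N)].
    val, _ = quad(lambda y: phi(y) * np.tanh(g - np.sqrt(g) * y), -10, 10)
    return 1.0 - val

def mi_bpsk(s):
    # Mutual information (nats) for the binary input:
    # I(s) = s - E[log cosh(s - sqrt(s) * N)].
    val, _ = quad(lambda y: phi(y) * np.log(np.cosh(s - np.sqrt(s) * y)), -10, 10)
    return s - val

snr = 2.0

# Gaussian input: I(snr) = (1/2) log(1 + snr) in closed form.
lhs = 0.5 * np.log(1 + snr)
rhs, _ = quad(lambda g: 0.5 * mmse_gaussian(g), 0, snr)
print(f"Gaussian: I = {lhs:.6f}  vs  MMSE integral = {rhs:.6f}")

# Binary input: compare the quadrature formula for I with the MMSE integral.
lhs = mi_bpsk(snr)
rhs, _ = quad(lambda g: 0.5 * mmse_bpsk(g), 0, snr)
print(f"Binary:   I = {lhs:.6f}  vs  MMSE integral = {rhs:.6f}")

For the Gaussian input both sides equal (1/2) log(1 + snr); for the binary input the two quadratures agree to numerical precision. Letting snr → ∞ and comparing against a Gaussian of matched variance turns this identity into the entropy representation quoted above: for unit-variance X, h(X) = (1/2) log(2πe) − (1/2) ∫_0^∞ [1/(1+γ) − mmse(γ)] dγ, which is the form the EPI proofs manipulate.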

References

[1] R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, 1985.

[2] R. Zamir, "A proof of the Fisher information inequality via a data processing argument," IEEE Trans. Inf. Theory, 1998.

[3] A. J. Stam, "Some inequalities satisfied by the quantities of information of Fisher and Shannon," Information and Control, 1959.

[4] A. Dembo, "Simple proof of the concavity of the entropy power with respect to added Gaussian noise," IEEE Trans. Inf. Theory, 1989.

[5] R. Zamir and M. Feder, "Rate-distortion performance in coding bandlimited sources by sampling and dithered quantization," IEEE Trans. Inf. Theory, 1995.

[6] D. Guo, S. Shamai, and S. Verdú, "Mutual information and minimum mean-square error in Gaussian channels," IEEE Trans. Inf. Theory, 2005.

[7] A. Lapidoth and S. M. Moser, "Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels," IEEE Trans. Inf. Theory, 2003.

[8] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.

[9] M. H. M. Costa, "A new entropy power inequality," IEEE Trans. Inf. Theory, 1985.

[10] P. P. Bergmans, "A simple converse for broadcast channels with additive white Gaussian noise (Corresp.)," IEEE Trans. Inf. Theory, 1974.

[11] E. H. Lieb, "Proof of an entropy conjecture of Wehrl," Communications in Mathematical Physics, 1978.

[12] T. E. Duncan, "Likelihood functions for stochastic signals in white noise," Information and Control, 1970.

[13] A. M. Tulino and S. Verdú, "Monotonic decrease of the non-Gaussianness of the sum of independent random variables: A simple proof," IEEE Trans. Inf. Theory, 2006.

[14] J. Binia, "Divergence and minimum mean-square error in continuous-time additive white Gaussian noise channels," IEEE Trans. Inf. Theory, 2006.

[15] Y. Oohama, "Gaussian multiterminal source coding," IEEE Trans. Inf. Theory, 1997.

[16] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, 1948.

[17] Y. Oohama, "The rate-distortion function for the quadratic Gaussian CEO problem," IEEE Trans. Inf. Theory, 1998.

[18] D. P. Palomar and S. Verdú, "Gradient of mutual information in linear vector Gaussian channels," IEEE Trans. Inf. Theory, 2006.

[19] S. Verdú and D. Guo, "A simple proof of the entropy-power inequality," IEEE Trans. Inf. Theory, 2006.

[20] C. Villani, "A short proof of the 'Concavity of entropy power'," IEEE Trans. Inf. Theory, 2000.

[21] A. Dembo, T. M. Cover, and J. A. Thomas, "Information theoretic inequalities," IEEE Trans. Inf. Theory, 1991.