A Strong Entropy Power Inequality

When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing function for Gaussian channels. Among other consequences, this ‘strong’ EPI generalizes the vector extension of Costa’s EPI to non-Gaussian channels in a precise sense. This leads to a new reverse EPI and, as a corollary, sharpens Stam’s uncertainty principle relating entropy power and Fisher information (or, equivalently, Gross’ logarithmic Sobolev inequality). Applications to network information theory are also given, including a short self-contained proof of the rate region for the two-encoder quadratic Gaussian source coding problem and a new outer bound for the one-sided Gaussian interference channel.
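
For reference, the classical statements being sharpened are the following standard facts (see [60], [30], [4]); they are background, not results of the paper. For a random vector $X$ in $\mathbb{R}^n$ with density and differential entropy $h(X)$, the entropy power is $N(X) = \frac{1}{2\pi e} e^{2h(X)/n}$, and the EPI asserts that for independent $X$ and $Y$,

$$N(X + Y) \geq N(X) + N(Y),$$

with equality if and only if $X$ and $Y$ are Gaussian with proportional covariances. Stam's uncertainty principle, with $J(X)$ denoting Fisher information, reads

$$N(X)\,J(X) \geq n,$$

with equality if and only if $X$ is white Gaussian; as noted above, this is equivalent to Gross' logarithmic Sobolev inequality [34].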

[1] Martin E. Hellman et al., The Gaussian wire-tap channel, 1978, IEEE Trans. Inf. Theory.

[2] Peng Xu et al., Forward and Reverse Entropy Power Inequalities in Convex Geometry, 2016, arXiv.

[3] Shlomo Shamai et al., The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel, 2006, IEEE Trans. Inf. Theory.

[4] Nelson M. Blachman et al., The convolution inequality for entropy powers, 1965, IEEE Trans. Inf. Theory.

[5] Mokshay M. Madiman et al., Generalized Entropy Power Inequalities and Monotonicity Properties of Information, 2006, IEEE Trans. Inf. Theory.

[6] Chandra Nair et al., Upper concave envelopes and auxiliary random variables, 2013.

[7] E. Carlen, Superadditivity of Fisher's information and logarithmic Sobolev inequalities, 1991.

[8] Pramod Viswanath et al., Rate Region of the Quadratic Gaussian Two-Encoder Source-Coding Problem, 2006, ISIT.

[9] Giuseppe Toscani et al., Stability Results for Logarithmic Sobolev and Gagliardo–Nirenberg Inequalities, 2014, arXiv:1412.0475.

[10] H. Vincent Poor et al., Channel coding: non-asymptotic fundamental limits, 2010.

[11] O. Johnson, Information Theory and the Central Limit Theorem, 2004.

[12] Hiroshi Sato et al., The capacity of the Gaussian interference channel under strong interference, 1981, IEEE Trans. Inf. Theory.

[13] A. Barron et al., Fisher information inequalities and the central limit theorem, 2001, arXiv:math/0111020.

[14] Salman Beigi et al., Equivalent characterization of reverse Brascamp-Lieb-type inequalities using information measures, 2016, ISIT.

[15] John M. Cioffi et al., A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels, 2006, ISIT.

[16] Sudeep Kamath et al., Reverse hypercontractivity using information measures, 2015, Allerton.

[17] Chandra Nair et al., Equivalent formulations of hypercontractivity using information measures, 2014.

[18] Thomas A. Courtade et al., An extremal inequality for long Markov chains, 2014, Allerton.

[19] Thomas A. Courtade et al., Strengthening the entropy power inequality, 2016, ISIT.

[20] M. Ledoux et al., Analysis and Geometry of Markov Diffusion Operators, 2013.

[21] Te Sun Han et al., A new achievable rate region for the interference channel, 1981, IEEE Trans. Inf. Theory.

[22] Chandra Nair et al., Sub-optimality of Han-Kobayashi achievable region for interference channels, 2015, ISIT.

[23] Maxim Raginsky et al., Strong Data Processing Inequalities and $\Phi$-Sobolev Inequalities for Discrete Channels, 2014, IEEE Trans. Inf. Theory.

[24] R. Durrett, Probability: Theory and Examples, 1993.

[25] Jun Chen et al., On the Sum Rate of Gaussian Multiterminal Source Coding: New Proofs and Results, 2010, IEEE Trans. Inf. Theory.

[26] Edward Nelson, The free Markoff field, 1973.

[27] Tsachy Weissman et al., Multiterminal Source Coding Under Logarithmic Loss, 2011, IEEE Trans. Inf. Theory.

[28] Yihong Wu et al., Wasserstein Continuity of Entropy and Outer Bounds for Interference Channels, 2015, IEEE Trans. Inf. Theory.

[29] Max H. M. Costa et al., Noisebergs in Z Gaussian interference channels, 2011, ITA Workshop.

[30] A. J. Stam, Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Inf. Control.

[31] Amir Dembo et al., Simple proof of the concavity of the entropy power with respect to Gaussian noise, 1989, IEEE Trans. Inf. Theory.

[32] S. Bobkov et al., Reverse Brunn–Minkowski and reverse entropy power inequalities for convex measures, 2011, arXiv:1109.5287.

[33] Mokshay Madiman et al., On the entropy of sums, 2008, IEEE Information Theory Workshop (ITW).

[34] L. Gross, Logarithmic Sobolev Inequalities, 1975.

[35] S. Varadhan et al., Asymptotic evaluation of certain Markov process expectations for large time, 1975.

[36] T. Cover et al., On the similarity of the entropy power inequality and the Brunn–Minkowski inequality, 1984, IEEE Trans. Inf. Theory.

[37] Max Fathi et al., Quantitative logarithmic Sobolev inequalities and stability estimates, 2014, arXiv:1410.6922.

[38] Patrick P. Bergmans et al., A simple converse for broadcast channels with additive white Gaussian noise (Corresp.), 1974, IEEE Trans. Inf. Theory.

[39] Igal Sason et al., Concentration of Measure Inequalities in Information Theory, Communications, and Coding, 2012, Found. Trends Commun. Inf. Theory.

[40] E. Carlen et al., Subadditivity of the Entropy and its Relation to Brascamp–Lieb Type Inequalities, 2007, arXiv:0710.0870.

[41] L. Ozarow et al., On a source-coding problem with two channels and three receivers, 1980, Bell Syst. Tech. J.

[42] E. Carlen et al., Entropy production by block variable summation and central limit theorems, 1991.

[43] Cyril Roberto et al., Bounds on the deficit in the logarithmic Sobolev inequality, 2014, arXiv:1408.2115.

[44] E. Lieb, Gaussian kernels have only Gaussian maximizers, 1990.

[45] Shlomo Shamai et al., A Vector Generalization of Costa's Entropy-Power Inequality With Applications, 2009, IEEE Trans. Inf. Theory.

[46] K. Ball et al., Solution of Shannon's problem on the monotonicity of entropy, 2004.

[47] Sui Tung et al., Multiterminal source coding (Ph.D. Thesis abstr.), 1978, IEEE Trans. Inf. Theory.

[48] Max H. M. Costa et al., Gaussian Z-interference channel: Around the corner, 2016, ITA Workshop.

[49] Yihong Wu et al., Strong data processing inequalities in power-constrained Gaussian channels, 2015, ISIT.

[50] Cédric Villani et al., A short proof of the "Concavity of entropy power", 2000, IEEE Trans. Inf. Theory.

[51] Giuseppe Toscani et al., A Strengthened Entropy Power Inequality for Log-Concave Densities, 2014, IEEE Trans. Inf. Theory.

[52] Sergio Verdú et al., Brascamp-Lieb inequality and its reverse: An information theoretic view, 2016, ISIT.

[53] Thomas A. Courtade, Concavity of entropy power: Equivalent formulations and generalizations, 2017, ISIT.

[54] Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2005.

[55] Yihong Wu et al., Dissipation of Information in Channels With Input Constraints, 2014, IEEE Trans. Inf. Theory.

[56] Tsachy Weissman et al., Justification of logarithmic loss via the benefit of side information, 2014, ISIT.

[57] Sergio Verdú et al., Functional Properties of Minimum Mean-Square Error and Mutual Information, 2012, IEEE Trans. Inf. Theory.

[58] Abbas El Gamal et al., Network Information Theory, 2011, Cambridge University Press.

[59] Vinod M. Prabhakaran et al., Rate region of the quadratic Gaussian CEO problem, 2004, ISIT.

[60] C. E. Shannon, A Mathematical Theory of Communication, 1948, Bell Syst. Tech. J.

[61] Amir K. Khandani et al., The Secrecy Capacity Region of the Gaussian MIMO Broadcast Channel, 2009, IEEE Trans. Inf. Theory.

[62] Chandra Nair et al., The Capacity Region of the Two-Receiver Gaussian Vector Broadcast Channel With Private and Common Messages, 2014, IEEE Trans. Inf. Theory.

[63] Yasutada Oohama et al., Rate-distortion theory for Gaussian multiterminal source coding systems with several side informations at the decoder, 2005, IEEE Trans. Inf. Theory.

[64] Max H. M. Costa et al., On the Gaussian interference channel, 1985, IEEE Trans. Inf. Theory.

[65] Van Hoang Nguyen et al., Entropy jumps for isotropic log-concave random vectors and spectral gap, 2012, arXiv:1206.5098.

[66] W. Bryc, The Normal Distribution: Characterizations with Applications, 1995.

[67] Thomas A. Courtade, Outer bounds for multiterminal source coding via a strong data processing inequality, 2013, ISIT.

[68] Max H. M. Costa et al., A new entropy power inequality, 1985, IEEE Trans. Inf. Theory.

[69] Y. Oohama, Gaussian multiterminal source coding, 1995, ISIT.

[70] F. Barthe, Optimal Young's inequality and its converse: a simple proof, 1997, arXiv:math/9704210.

[71] Olivier Rioul et al., Information Theoretic Proofs of Entropy Power Inequalities, 2007, IEEE Trans. Inf. Theory.