The entropy power of a sum is fractionally superadditive

It is shown that the entropy power of a sum of independent random vectors, viewed as a function of the set of summands, is fractionally superadditive. This resolves a conjecture of the first author and A. R. Barron, and in particular implies all previously known entropy power inequalities for independent random variables. It is also shown that, in general dimension, the entropy power of a sum of independent random vectors is not supermodular.
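For concreteness, the result can be sketched in standard notation (the definitions below are the usual ones; the notation is ours, not necessarily the paper's). For a random vector X in R^d with differential entropy h(X), the entropy power is N(X) = (2\pi e)^{-1} e^{2h(X)/d}. Nonnegative weights \beta_S, indexed by nonempty subsets S of [n] = \{1, \dots, n\}, form a fractional partition if \sum_{S \ni i} \beta_S = 1 for every i \in [n]. Fractional superadditivity then reads

\[
  N\!\left( \sum_{i=1}^{n} X_i \right)
  \;\geq\;
  \sum_{S} \beta_S \, N\!\left( \sum_{i \in S} X_i \right)
\]

for independent X_1, \dots, X_n and every fractional partition (\beta_S). Supermodularity of the set function v(S) = N\left( \sum_{i \in S} X_i \right) would be the condition v(S \cup T) + v(S \cap T) \ge v(S) + v(T) for all S, T \subseteq [n]; the second statement above says this can fail.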
