Entropy power inequality for a family of discrete random variables

The Entropy Power Inequality (EPI) is known to hold whenever the random variables involved have densities. Far less work has been done to identify discrete distributions for which the inequality holds with the differential entropy replaced by the discrete entropy. Harremoës and Vignat showed that it holds for the pair (B(m, p), B(n, p)), m, n ∈ ℕ, where B(n, p) denotes a Binomial distribution with n trials, each with success probability p, in the case p = 0.5. In this paper, we considerably expand the set of Binomial distributions for which the inequality holds and, in particular, identify a threshold n₀(p) such that the EPI holds for (B(m, p), B(n, p)) whenever m, n ≥ n₀(p). We further show that the EPI holds for discrete random variables that can be expressed as sums of n independent and identically distributed (IID) discrete random variables, provided n is sufficiently large.
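
For concreteness, the inequality in question can be stated as follows (a sketch with entropies in nats; the paper's own normalization may differ). For independent X ~ B(m, p) and Y ~ B(n, p), the discrete EPI asserts

    e^{2H(X+Y)} \ge e^{2H(X)} + e^{2H(Y)},

where H denotes discrete Shannon entropy. This mirrors the classical differential-entropy form e^{2h(X+Y)} \ge e^{2h(X)} + e^{2h(Y)}, which holds for independent random variables with densities.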

[1]  E. Lukács, Applications of Faà di Bruno's Formula in Mathematical Statistics, 1955.

[2]  J. H. Conway and R. K. Guy, The Book of Numbers, 1996.

[3]  O. Johnson, et al., Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality, 2009, IEEE Transactions on Information Theory.

[4]  F. Topsøe, et al., Some inequalities for information divergence and related measures of discrimination, 2000, IEEE Transactions on Information Theory.

[5]  W. Rudin, Principles of Mathematical Analysis, 1964.

[6]  A. Dembo, et al., Information theoretic inequalities, 1991, IEEE Transactions on Information Theory.

[7]  A. M. Tulino, et al., Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof, 2006, IEEE Transactions on Information Theory.

[8]  P. Harremoës and C. Vignat, An Entropy Power Inequality for the Binomial Family, 2003.

[9]  M. H. M. Costa, et al., A new entropy power inequality, 1985, IEEE Transactions on Information Theory.

[10]  C. E. Shannon, A Mathematical Theory of Communication, 1948, Bell System Technical Journal.

[11]  C. Knessl, et al., Integral representations and asymptotic expansions for Shannon and Renyi entropies, 1998.

[12]  S. Shamai, et al., Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.

[13]  S. Verdú, et al., A simple proof of the entropy-power inequality, 2006, IEEE Transactions on Information Theory.

[14]  Y. Yu, et al., Monotonic Convergence in an Information-Theoretic Law of Small Numbers, 2008, IEEE Transactions on Information Theory.

[15]  K. Ball, et al., Solution of Shannon's problem on the monotonicity of entropy, 2004.

[16]  N. M. Blachman, et al., The convolution inequality for entropy powers, 1965, IEEE Transactions on Information Theory.

[17]  M. M. Madiman, et al., Generalized Entropy Power Inequalities and Monotonicity Properties of Information, 2006, IEEE Transactions on Information Theory.

[18]  S. Shamai, et al., A binary analog to the entropy-power inequality, 1990, IEEE Transactions on Information Theory.

[19]  O. Johnson, et al., Thinning, Entropy, and the Law of Thin Numbers, 2009, IEEE Transactions on Information Theory.

[20]  O. Johnson, Log-concavity and the maximum entropy property of the Poisson distribution, 2006, arXiv:math/0603647.