On the entropy and log-concavity of compound Poisson measures

Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisson approximation limit theorems (analogous to the corresponding developments for the central limit theorem and for simple Poisson approximation), this work examines sufficient conditions under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. We show that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. The proofs are largely based on ideas related to the semigroup approach introduced in recent work by Johnson for the Poisson family. Sufficient conditions are given for compound distributions to be log-concave, and specific examples are presented illustrating all the above results.
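To fix notation (a brief sketch of the standard objects involved, not drawn verbatim from the paper): for a rate λ > 0 and a compounding distribution Q on the positive integers, the compound Poisson measure CP(λ, Q) is the law of a sum of a Poisson(λ) number of i.i.d. random variables with common law Q, and the entropy being maximized is the usual Shannon entropy. In LaTeX notation,

\[
  \mathrm{CP}(\lambda,Q)\{k\} \;=\; \sum_{n=0}^{\infty} e^{-\lambda}\,\frac{\lambda^{n}}{n!}\, Q^{*n}\{k\}, \qquad k = 0,1,2,\ldots,
\]
\[
  H(P) \;=\; -\sum_{k \ge 0} P\{k\}\,\log P\{k\},
\]

where \(Q^{*n}\) denotes the n-fold convolution of Q (with \(Q^{*0} = \delta_0\)). A measure P on the nonnegative integers is log-concave if \(P\{k\}^2 \ge P\{k-1\}\,P\{k+1\}\) for all \(k \ge 1\); the maximum entropy question asks when CP(λ, Q) maximizes H(P) over a suitable class of such measures with matching parameters.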

[1] Iain M. Johnstone, et al. Une mesure d'information caractérisant la loi de Poisson, 1987.

[2] Mokshay M. Madiman, et al. Generalized Entropy Power Inequalities and Monotonicity Properties of Information, 2006, IEEE Transactions on Information Theory.

[3] A. Barron. Entropy and the Central Limit Theorem, 1986.

[4] F. W. Steutel, et al. Discrete analogues of self-decomposability and stability, 1979.

[5] Peter Harremoës, et al. Binomial and Poisson distributions as maximum entropy distributions, 2001, IEEE Transactions on Information Theory.

[6] J. Cooper. Total Positivity, Vol. I, 1970.

[7] Louis H. Y. Chen, et al. Compound Poisson Approximation for Nonnegative Random Variables Via Stein's Method, 1992.

[8] O. Johnson. Log-concavity and the maximum entropy property of the Poisson distribution, 2006, math/0603647.

[9] Antonia Maria Tulino, et al. Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof, 2006, IEEE Transactions on Information Theory.

[10] P. Mateev. On the Entropy of the Multinomial Distribution, 1978.

[11] J. Keilson, et al. Some Results for Discrete Unimodality, 1971.

[12] Yeong-Nan Yeh, et al. Log-concavity and LC-positivity, 2007, Journal of Combinatorial Theory, Series A.

[13] Ingram Olkin, et al. Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution, 1981.

[14] B. Gnedenko, et al. Random Summation: Limit Theorems and Applications, 1996.

[15] Oliver Johnson, et al. Preservation of log-concavity on summation, 2005, math/0502548.

[16] Zhen Zhang, et al. On the maximum entropy of the sum of two dependent random variables, 1994, IEEE Transactions on Information Theory.

[17] R. Pemantle. Towards a theory of negative dependence, 2000, math/0404095.

[18] J. Linnik. An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions, 1959.

[19] K. Ball, et al. Solution of Shannon's problem on the monotonicity of entropy, 2004.

[20] O. Johnson. Information Theory and the Central Limit Theorem, 2004.

[21] Oliver Johnson, et al. Entropy and the law of small numbers, 2005, IEEE Transactions on Information Theory.

[22] J. Keilson, et al. Uniform stochastic ordering and related inequalities, 1982.

[23] Gordon E. Willmot, et al. Monotonicity and aging properties of random sums, 2005.

[24] Oliver Johnson, et al. Thinning and the Law of Small Numbers, 2007 IEEE International Symposium on Information Theory.