Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O.T. Johnson, Log-concavity and the maximum entropy property of the Poisson distribution, Stochastic Process. Appl. 117(6) (2007) 791-802] used a semigroup approach to show that the Poisson distribution has maximal entropy among all ultra-log-concave distributions with fixed mean. We show, via a non-trivial extension of this semigroup approach, that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed, and applications to combinatorics are examined: new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics.
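The two notions contrasted in the abstract can be checked numerically. The sketch below (not part of the paper) computes a Poisson pmf and a compound Poisson pmf via Panjer's recursion, then tests the log-concavity inequality p(k)^2 >= p(k-1)p(k+1) and the stronger ultra-log-concavity inequality k p(k)^2 >= (k+1) p(k+1) p(k-1). The particular compound Poisson example (rate 1.5, jumps of size 1 and 2 with probabilities 0.7 and 0.3) is a hypothetical choice for illustration: it is log-concave but not ultra-log-concave, matching the distinction the abstract draws.

```python
import math

def poisson_pmf(lam, kmax):
    """Poisson(lam) probabilities p(0), ..., p(kmax)."""
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(kmax + 1)]

def compound_poisson_pmf(lam, q, kmax):
    """Compound Poisson CP(lam, q) pmf via Panjer's recursion:
    p(0) = exp(-lam),  p(n) = (lam/n) * sum_j j*q(j)*p(n-j),
    where q is a dict {jump size: probability} on the positive integers."""
    p = [math.exp(-lam)]
    for n in range(1, kmax + 1):
        p.append((lam / n) * sum(j * qj * p[n - j]
                                 for j, qj in q.items() if j <= n))
    return p

def is_log_concave(p, tol=1e-12):
    # log-concavity: p(k)^2 >= p(k-1) * p(k+1) for all interior k
    return all(p[k]**2 >= p[k - 1] * p[k + 1] - tol
               for k in range(1, len(p) - 1))

def is_ultra_log_concave(p, tol=1e-12):
    # ultra-log-concavity: k*p(k)^2 >= (k+1)*p(k+1)*p(k-1),
    # i.e. log-concavity of p relative to the Poisson pmf
    return all(k * p[k]**2 >= (k + 1) * p[k + 1] * p[k - 1] - tol
               for k in range(1, len(p) - 1))

# The Poisson pmf satisfies the ultra-log-concavity inequality with equality;
# the illustrative compound Poisson below is log-concave but not ultra-log-concave.
po = poisson_pmf(2.0, 30)
cp = compound_poisson_pmf(1.5, {1: 0.7, 2: 0.3}, 30)
```

For this example the log-concavity of the compound Poisson is consistent with known sufficient conditions of the type discussed in the paper (the jump distribution is log-concave and the rate is large enough relative to the jump probabilities), while ultra-log-concavity already fails at k = 1.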
