Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration

The problem of approximating the distribution of a sum $S_n = \sum_{i=1}^{n} Y_i$ of $n$ discrete random variables $Y_i$ by a Poisson or a compound Poisson distribution arises naturally in many classical and current applications, such as statistical genetics, dynamical systems, the recurrence properties of Markov processes, and reliability theory. Using information-theoretic ideas and techniques, we derive a family of new bounds for compound Poisson approximation. We take an approach similar to that of Kontoyiannis, Harremoës and Johnson (2003), and we generalize some of their Poisson approximation bounds to the compound Poisson case. Partly motivated by these results, we derive a new logarithmic Sobolev inequality for the compound Poisson measure and use it to prove measure-concentration bounds for a large class of discrete distributions.
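As a point of reference for the setting above, a compound Poisson law is the distribution of a Poisson-distributed number of i.i.d. summands; the display below is a standard characterization given only for orientation, with $\lambda$, $Q$, $N$, and $X_j$ as generic symbols not taken from the paper:

\[
\mathrm{CP}(\lambda, Q) \;=\; \mathcal{L}\Big(\sum_{j=1}^{N} X_j\Big), \qquad N \sim \mathrm{Poisson}(\lambda), \qquad X_1, X_2, \ldots \stackrel{\text{i.i.d.}}{\sim} Q \text{ independent of } N.
\]

In particular, the ordinary Poisson distribution is recovered when $Q$ is a point mass at $1$.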