In this paper we establish lower bounds on information divergence from a distribution to certain important classes of distributions such as the Gaussian, exponential, Gamma, Poisson, geometric, and binomial distributions. These lower bounds are tight, and for several convergence theorems where a rate of convergence can be computed, this rate is determined by the lower bounds proved in this paper. General techniques for obtaining lower bounds in terms of moments are developed.

I. INTRODUCTION AND NOTATIONS

In 2004, O. Johnson and A. Barron proved [8] that the rate of convergence in the information-theoretic Central Limit Theorem is upper bounded by c/n under suitable conditions. P. Harremoës extended this work in [2] based on a maximum entropy approach. Similar results have been obtained for the convergence of binomial distributions to Poisson distributions. Finally, the rate of convergence of convolutions of distributions on the unit circle toward the uniform distribution can be bounded. In each of these cases, lower bounds on information divergence in terms of moments of orthogonal polynomials or trigonometric functions give lower bounds on the rate of convergence. In this paper we provide more lower bounds on information divergence, using mainly orthogonal polynomials and the related exponential families.

We will identify x! with Γ(x + 1) even when x is not an integer. Similarly, the generalized binomial coefficient (x choose n) equals x(x − 1)···(x − n + 1)/n! even when x is not an integer. We use τ as short for 2π.
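The generalized factorial and binomial coefficient above can be sketched numerically. The following is a minimal illustration (function names are ours, not from the paper): the falling-factorial form x(x − 1)···(x − n + 1)/n! is compared against the equivalent Gamma-function form Γ(x + 1)/(Γ(n + 1) Γ(x − n + 1)), which is valid whenever the Gamma arguments avoid the non-positive integers.

```python
from math import gamma, prod, factorial

def gen_binom(x: float, n: int) -> float:
    """Generalized binomial coefficient x(x-1)...(x-n+1)/n!,
    defined for real x and non-negative integer n."""
    return prod(x - k for k in range(n)) / factorial(n)

def gen_binom_gamma(x: float, n: int) -> float:
    """Same quantity via x! = Gamma(x+1), i.e.
    Gamma(x+1) / (Gamma(n+1) * Gamma(x-n+1))."""
    return gamma(x + 1) / (gamma(n + 1) * gamma(x - n + 1))

# Example: x = 2.5, n = 2 gives 2.5 * 1.5 / 2 = 1.875,
# and the two forms agree.
```

For integer x the falling-factorial form reduces to the ordinary binomial coefficient, while the Gamma form extends it smoothly to non-integer arguments, which is what the exponential-family computations in the paper rely on.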
REFERENCES

[1] P. C. Consul et al., "A Generalized Negative Binomial Distribution," 1971.
[2] P. Harremoës, "Lower Bounds for Divergence in Central Limit Theorem," Electron. Notes Discret. Math., 2005.
[3] P. Consul et al., "A Generalization of the Poisson Distribution," 1973.
[4] G. Letac et al., "Natural Real Exponential Families with Cubic Variance Functions," 1990.
[5] C. Morris, "Natural Exponential Families with Quadratic Variance Functions," 1982.
[6] O. Johnson et al., "Thinning and information projections," in 2008 IEEE International Symposium on Information Theory, 2008.
[7] R. Askey et al., "Convolution structures for Laguerre polynomials," 1977.
[8] A. Barron et al., "Fisher information inequalities and the central limit theorem," arXiv:math/0111020, 2001.
[9] T. Yanagimoto et al., "The inverse binomial distribution as a statistical model," 1989.