Rates of convergence for minimum contrast estimators

Summary. We present here a general study of minimum contrast estimators in a nonparametric setting (although our results are also valid in the classical parametric case) for independent observations. These estimators include many of the most popular estimators in various situations, such as maximum likelihood estimators, least squares and other estimators of the regression function, and estimators for mixture models or deconvolution. The main theorem relates the rate of convergence of these estimators to the entropy structure of the space of parameters. Optimal rates depending on entropy conditions are already known, at least for some of the models involved, and they agree with what we get for minimum contrast estimators as long as the entropy is not too large. But under some circumstances ("large" entropies, or changes in the entropy structure due to local perturbations), the resulting rates are only suboptimal. Counterexamples are constructed which show that this phenomenon is real for nonparametric maximum likelihood or regression. This proves that, under purely metric assumptions, our theorem is optimal and that minimum contrast estimators can indeed be suboptimal.
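To make the entropy-to-rate relation concrete, the following is a hedged sketch of the form such a result typically takes in the polynomial-entropy regime; the constants, the exact entropy notion (bracketing versus covering), and the threshold on the exponent are illustrative assumptions, not a restatement of the paper's main theorem. Suppose the parameter set, equipped with a distance d (Hellinger for densities, L2 for regression), has metric entropy H(epsilon) bounded as

% illustrative polynomial-entropy assumption
\[
  H(\varepsilon) \;\le\; C\,\varepsilon^{-\alpha}, \qquad 0 < \alpha < 2 .
\]

Then one expects a minimum contrast estimator \(\hat{s}_n\) built from \(n\) independent observations to satisfy a risk bound of the form

% expected rate in the "small entropy" regime
\[
  \mathbb{E}\bigl[d^{2}(\hat{s}_n, s)\bigr] \;\le\; C'\, n^{-2/(2+\alpha)},
\]

which coincides with the known minimax rate in this regime. When \(\alpha \ge 2\) ("large" entropy), the same entropy argument only delivers slower rates, which is where the suboptimality discussed above appears.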
