Parameter tuning in pointwise adaptation using a propagation approach

This paper addresses adaptive estimation of a univariate quantity such as the value of a regression function at a given point or a linear functional in a linear inverse problem. We consider an adaptive procedure, originating with Lepski [Theory Probab. Appl. 35 (1990) 454-466], that selects in a data-driven way one estimate out of a given class of estimates ordered by their variability. A serious practical problem with this and similar procedures is the choice of tuning parameters such as thresholds: numerical results show that the theoretically recommended values are too conservative and lead to a strong oversmoothing effect, so a careful choice of these parameters is essential for achieving reasonable estimation quality. The main contribution of this paper is a new approach to choosing the parameters of the procedure by prescribing the behavior of the resulting estimate in a simple parametric situation. We establish a non-asymptotic "oracle" bound, which shows that the estimation risk is, up to a logarithmic multiplier, equal to the risk of the "oracle" estimate that is optimally selected from the given family. A numerical study demonstrates good performance of the resulting procedure on a number of simulated examples.
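The Lepski-type selection rule described above can be sketched in a few lines. The following is a minimal illustration, not the paper's calibrated procedure: estimates are assumed ordered from least to most smoothed (standard deviations decreasing), and the threshold multiplier `z` is exactly the kind of tuning parameter whose choice the paper's propagation approach addresses; here it is simply a hypothetical input.

```python
def lepski_select(estimates, sigmas, z=2.0):
    """Lepski-type data-driven selection among a family of estimates.

    estimates : point estimates ordered from least to most smoothed
                (least to most biased).
    sigmas    : their standard deviations, assumed decreasing.
    z         : threshold multiplier (a tuning parameter; the paper
                proposes calibrating such parameters by prescribing the
                procedure's behavior in a simple parametric situation).

    Returns the index of the selected estimate.
    """
    k_hat = 0
    for k in range(1, len(estimates)):
        # Accept estimate k only if it stays within the noise band of
        # every less-smoothed (less-biased) estimate j < k.
        consistent = all(
            abs(estimates[k] - estimates[j]) <= z * sigmas[j]
            for j in range(k)
        )
        if consistent:
            k_hat = k
        else:
            break  # first rejection stops the search
    return k_hat
```

The rule trades bias against variance: it keeps increasing the amount of smoothing as long as each new, less variable estimate remains statistically indistinguishable from all preceding ones, and stops at the first significant discrepancy.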

[1] V. Spokoiny et al., Optimal pointwise adaptive methods in nonparametric estimation, 1997.

[2] P. Massart et al., Rates of convergence for minimum contrast estimators, 1993.

[3] O. Lepskii, Asymptotically minimax adaptive estimation. I: Upper bounds. Optimally adaptive estimates, 1992.

[4] O. Lepskii et al., Asymptotically minimax adaptive estimation. II: Schemes without optimal adaptation: adaptive estimators, 1993.

[5] F. Bauer et al., Some considerations concerning regularization and parameter choice algorithms, 2007.

[6] L. Birgé et al., Model selection via testing: an alternative to (penalized) maximum likelihood estimators, 2006.

[7] A. Juditsky et al., Learning by mirror averaging, 2005, arXiv:math/0511468.

[8] L. Cavalier et al., On the problem of local adaptive estimation in tomography, 2001.

[9] J. Johannes et al., Adaptive estimation of linear functionals in functional linear models, 2011.

[10] S. V. Pereverzev et al., On adaptive inverse estimation of linear functionals in Hilbert scales, 2003.

[11] O. Lepskii, On a problem of adaptive estimation in Gaussian white noise, 1991.

[12] A. Goldenshluger, Nonparametric estimation of transfer functions: rates of convergence and adaptation, IEEE Trans. Inf. Theory, 1998.

[13] T. T. Cai et al., Prediction in functional linear regression, 2006.

[14] A. Goldenshluger et al., Adaptive estimation of linear functionals in Hilbert scales from indirect white noise observations, 2000.

[15] P. Massart et al., Minimum contrast estimators on sieves: exponential bounds and rates of convergence, 1998.

[16] E. Mammen et al., Optimal spatial adaptation to inhomogeneous smoothness: an approach based on kernel estimates with variable bandwidth selectors, 1997.

[17] G. K. Golubev, The method of risk envelope in estimation of linear functionals, Probl. Inf. Transm., 2004.

[18] A. Goldenshluger, On pointwise adaptive nonparametric deconvolution, 1999.