Let the estimated function belong to a Lipschitz class of order $\alpha$. Consider the minimax approach, where the infimum is taken over all possible estimators and the supremum is taken over the class of estimated functions. It is known that if the order $\alpha$ is unknown, then the minimax mean squared (pointwise) error convergence slows down from $n^{-2\alpha/(2\alpha+1)}$, attainable when $\alpha$ is given, to $(n/\ln n)^{-2\alpha/(2\alpha+1)}$. At the same time, the minimax mean integrated squared (global) error convergence is of order $n^{-2\alpha/(2\alpha+1)}$ whether $\alpha$ is known or unknown. We show that a similar phenomenon holds for analytic functions, where the lack of knowledge of the maximal set to which the function can be analytically continued leads to the loss of a sharp constant. Surprisingly, in the more general adaptive minimax setting, where we consider the union of a range of Lipschitz classes and a range of analytic classes, neither the pointwise error convergence nor the global error convergence suffers an additional slowing down.
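For concreteness, the pointwise rates above can be written side by side in standard minimax notation; this display is only a restatement of the claim in the abstract, with the estimator $\hat{f}$, the evaluation point $x_0$, and the class label $\mathrm{Lip}(\alpha)$ introduced here for illustration (the display assumes amsmath for the cases environment):

% Minimax pointwise MSE over a Lipschitz class of order \alpha:
% the adaptive (unknown \alpha) rate carries an extra logarithmic factor.
\[
\inf_{\hat{f}} \; \sup_{f \in \mathrm{Lip}(\alpha)}
\mathbb{E}\bigl[\bigl(\hat{f}(x_0) - f(x_0)\bigr)^2\bigr]
\asymp
\begin{cases}
n^{-2\alpha/(2\alpha+1)}, & \alpha \text{ known},\\
\bigl(n/\ln n\bigr)^{-2\alpha/(2\alpha+1)}, & \alpha \text{ unknown},
\end{cases}
\]

whereas the mean integrated squared error is of order $n^{-2\alpha/(2\alpha+1)}$ in both cases, so only the pointwise problem pays a logarithmic price for adaptation.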