On global and pointwise adaptive estimation

Let an estimated function belong to a Lipschitz class of order $\alpha$. Consider a minimax approach where the infimum is taken over all possible estimators and the supremum is taken over the considered class of estimated functions. It is known that, if the order $\alpha$ is unknown, then the minimax mean squared (pointwise) error convergence slows down from $n^{-2\alpha/(2\alpha+1)}$, attainable when $\alpha$ is given, to $[n/\ln(n)]^{-2\alpha/(2\alpha+1)}$. At the same time, the minimax mean integrated squared (global) error convergence is proportional to $n^{-2\alpha/(2\alpha+1)}$ whether $\alpha$ is known or unknown. We show that a similar phenomenon holds for analytic functions, where the lack of knowledge of the maximal set to which the function can be analytically continued leads to the loss of a sharp constant. Surprisingly, in the more general adaptive minimax setting where we consider the union of a range of Lipschitz classes and a range of analytic classes, neither the pointwise error convergence nor the global error convergence suffers an additional slowing down.
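As a purely illustrative numerical sketch (not part of the paper), the logarithmic price of pointwise adaptation described above can be made concrete by comparing the two rates; the function names below are our own:

```python
import math

def rate_known_alpha(n, alpha):
    """Pointwise minimax MSE rate n^{-2a/(2a+1)} when the order alpha is known."""
    return n ** (-2 * alpha / (2 * alpha + 1))

def rate_unknown_alpha(n, alpha):
    """Slower adaptive pointwise rate [n / ln(n)]^{-2a/(2a+1)} when alpha is unknown."""
    return (n / math.log(n)) ** (-2 * alpha / (2 * alpha + 1))

if __name__ == "__main__":
    alpha = 1.0  # illustrative smoothness order
    # The ratio of the two rates grows like (ln n)^{2a/(2a+1)}:
    # a logarithmic, not polynomial, penalty for not knowing alpha.
    for n in (10**3, 10**6, 10**9):
        ratio = rate_unknown_alpha(n, alpha) / rate_known_alpha(n, alpha)
        print(f"n = {n:>10}: adaptive/known rate ratio = {ratio:.3f}")
```

The printout shows the ratio increasing only logarithmically in $n$, which is the sense in which pointwise adaptation is "slowed down" while global adaptation is not.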