Hardest One-Dimensional Subproblems
For a long time, lower bounds on the difficulty of estimation have been constructed by showing that estimation is difficult even in certain one-dimensional subproblems. The logical extension of this is to identify the hardest one-dimensional subproblems and to ask whether they are, either exactly or approximately, as difficult as the full problem. We do this in three settings: estimating linear functionals from observations with Gaussian noise, recovering linear functionals from observations with deterministic noise, and making confidence statements for linear functionals from observations with Gaussian noise. We show that the minimax value of the hardest subproblem is, in each case, equal to, or within a few percent of, the minimax value of the full problem. The sharpest known bounds on the asymptotic minimax risk and on the minimax confidence interval size follow from this approach. New connections between statistical estimation and the theory of optimal recovery are also established. For example, 95% confidence intervals based on estimators developed in the theory of optimal recovery are optimal among linear confidence procedures and within 19% of minimax among all procedures.
Abbreviated Title: Hardest 1-d subproblems
AMS-MOS Subject Classifications: Primary 62J05; secondary 62G35, 41A15.
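As a concrete illustration of the first setting, the LaTeX sketch below spells out the usual Gaussian-noise formulation of the problem. It is a reconstruction under assumptions rather than a quotation from the paper: the symbols (noise level \epsilon, convex class X, linear functional T, risks R and R_1, segment endpoints x_{-1} and x_1) are labels introduced here for illustration.
% Observations in Gaussian noise (assumed white-noise sequence model):
%   y_i = x_i + \epsilon z_i, with z_i iid standard normal, x constrained to a convex set X,
%   and the goal is to estimate a linear functional T(x).
\[
  y_i = x_i + \epsilon z_i, \qquad z_i \overset{\text{iid}}{\sim} N(0,1), \qquad x \in X \ \text{(convex)} .
\]
% Minimax risk of the full problem:
\[
  R(\epsilon) \;=\; \inf_{\hat T}\, \sup_{x \in X}\, \mathbb{E}\bigl(\hat T(y) - T(x)\bigr)^{2} .
\]
% A one-dimensional subproblem restricts x to a segment [x_{-1}, x_{1}] contained in X,
% which reduces estimation of T(x) to a bounded-normal-mean problem; the hardest such
% subproblem has minimax risk
\[
  R_{1}(\epsilon) \;=\; \sup_{x_{-1},\, x_{1} \in X}\;
     \inf_{\hat T}\, \sup_{x \in [x_{-1},\, x_{1}]}\, \mathbb{E}\bigl(\hat T(y) - T(x)\bigr)^{2}
  \;\le\; R(\epsilon) .
\]
% The result summarized in the abstract is that this inequality is an equality, or nearly so.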