Direct asymptotic equivalence of nonparametric regression and the infinite dimensional location problem

We begin with a nonparametric regression problem having random predictors (random design) and Gaussian errors. We produce a convenient, easily implementable mapping of this problem into a Gaussian infinite dimensional location problem. Such an infinite dimensional problem can reflect a Fourier, wavelet, or other orthogonal basis representation of the original regression situation, and in this way it may be easier to analyze than the original regression formulation. There is a considerable literature on such analyses; beyond describing the situation, we do not pursue the analysis of such infinite dimensional models here. For most of our results, the distribution of the random regressors may be either known or unknown. The correspondence we produce between the regression and location problems is an asymptotic equivalence mapping. (We also explicitly describe the converse mapping from the location problem to the regression.) Thus any solution to a statistical problem in one formulation can easily be converted into a solution in the other. The basic mapping from the regression formulation to the location formulation involves a few steps: first, bin the regression observations and use the bin averages to compute an empirical infinite series transform; then truncate this series appropriately; next, add a small amount of prescribed Gaussian noise to the truncated series coefficients; finally, use a subset of these coefficients to linearly predict the remaining tail coordinates of the infinite series. In many applications the latter two steps are not necessary, even though they are needed for an explicit asymptotic equivalence mapping.
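The first two steps of the mapping (binning, empirical series transform, truncation) can be illustrated concretely. The following is a minimal Python sketch, assuming predictors supported on [0, 1] and a cosine orthonormal basis; the function name regression_to_location, the default bin count m ≈ √n, and the truncation level K are illustrative choices rather than the paper's prescriptions, and the noise-addition and tail-prediction steps are omitted since, as noted above, they are often unnecessary in practice.

```python
import numpy as np

def regression_to_location(x, y, m=None, K=None):
    """Map random-design regression data (x_i, y_i) on [0, 1] to empirical
    coefficients of an infinite dimensional Gaussian location problem.

    Illustrative steps: bin the observations, average within bins, take an
    empirical orthogonal-series (here cosine) transform of the bin averages,
    and truncate the series at K terms.
    """
    n = len(y)
    m = m or int(np.sqrt(n))   # number of bins (illustrative choice)
    K = K or m // 2            # truncation level (illustrative choice)

    # 1. Bin the regression observations and compute bin averages.
    edges = np.linspace(0.0, 1.0, m + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, m - 1)
    counts = np.bincount(idx, minlength=m)
    sums = np.bincount(idx, weights=y, minlength=m)
    bin_means = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)

    # 2. Empirical orthogonal-series transform of the bin averages
    #    (cosine basis evaluated at bin centers), truncated at K terms.
    centers = (edges[:-1] + edges[1:]) / 2.0
    theta_hat = np.empty(K)
    theta_hat[0] = bin_means.mean()
    for k in range(1, K):
        phi_k = np.sqrt(2.0) * np.cos(np.pi * k * centers)
        theta_hat[k] = np.mean(bin_means * phi_k)
    return theta_hat

# Example: simulate a random-design regression and map it to series coefficients.
rng = np.random.default_rng(0)
x = rng.uniform(size=1000)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(1000)
theta_hat = regression_to_location(x, y)
```

The resulting coefficients can then be treated as observations in the infinite dimensional location problem, for example as input to a shrinkage or thresholding estimator defined in that formulation.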
