Partial inverse regression

In regression with a vector of quantitative predictors, sufficient dimension reduction methods can effectively reduce the predictor dimension while preserving the full regression information and without assuming a parametric model. However, all current reduction methods require the sample size n to exceed the number of predictors p. Partial least squares, by contrast, is well known to handle problems with n < p. We first establish a link between partial least squares and sufficient dimension reduction. Motivated by this link, we then propose a new dimension reduction method, entitled partial inverse regression. We show that its sample estimator is consistent, and that its performance is similar or superior to that of partial least squares when n < p, especially when the regression model is nonlinear or heteroscedastic. An example involving the spectroscopy analysis of biscuit dough is also given. Copyright 2007, Oxford University Press.
