ASYMPTOTIC PROPERTIES OF SUFFICIENT DIMENSION REDUCTION WITH A DIVERGING NUMBER OF PREDICTORS

We investigate the asymptotic properties of a family of sufficient dimension reduction estimators when the number of predictors p diverges to infinity with the sample size. We adopt a general formulation of dimension reduction estimation through least squares regression of a set of transformations of the response. This formulation allows us to establish consistency of the estimated reduction projection. We then introduce the SCAD max penalty, together with a difference convex optimization algorithm, to achieve variable selection. We show that the penalized estimator selects all truly relevant predictors and excludes all irrelevant ones with probability approaching one, while maintaining consistent estimation of the reduction basis over the relevant predictors. Our work differs from most model-based selection methods in that it does not require a traditional model; it also extends existing sufficient dimension reduction and model-free variable selection approaches from the fixed-p setting to diverging p.
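To make the least squares formulation concrete, the sketch below estimates a basis of the central subspace by regressing slice-indicator transformations of the response on the standardized predictors and extracting leading singular vectors of the stacked slope matrix. This is a minimal illustration, not the paper's implementation: the choice of slice indicators (which recovers sliced inverse regression), the function name sdr_basis_ls, and all defaults are our own assumptions.

    import numpy as np

    def sdr_basis_ls(X, Y, d, n_slices=10):
        """Least squares sufficient dimension reduction (sketch).

        Regress centered slice indicators of Y on standardized X by OLS
        and take the top-d left singular vectors of the stacked slope
        matrix as an estimated basis of the central subspace.
        """
        n, p = X.shape
        # Standardize predictors: Z = (X - mean) Sigma^{-1/2}.
        Xc = X - X.mean(axis=0)
        Sigma = Xc.T @ Xc / n
        # Inverse square root via eigendecomposition (Sigma assumed
        # positive definite, i.e. n reasonably larger than p).
        w, V = np.linalg.eigh(Sigma)
        Sigma_inv_half = V @ np.diag(w ** -0.5) @ V.T
        Z = Xc @ Sigma_inv_half

        # Transformations f_h(Y) = 1{Y in slice h}, centered.
        edges = np.quantile(Y, np.linspace(0, 1, n_slices + 1))
        labels = np.clip(np.searchsorted(edges[1:-1], Y, side="right"),
                         0, n_slices - 1)
        F = np.zeros((n, n_slices))
        F[np.arange(n), labels] = 1.0
        F -= F.mean(axis=0)

        # OLS slopes of each transformation on Z, stacked: p x n_slices.
        B, *_ = np.linalg.lstsq(Z, F, rcond=None)

        # Leading d left singular vectors span the estimated subspace in
        # the Z scale; map the basis back to the original X scale.
        U, _, _ = np.linalg.svd(B, full_matrices=False)
        return Sigma_inv_half @ U[:, :d]

    # Hypothetical usage on a single-index model with one relevant predictor:
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    beta = np.zeros(10); beta[0] = 1.0
    Y = np.exp(X @ beta) + 0.1 * rng.standard_normal(500)
    B_hat = sdr_basis_ls(X, Y, d=1)   # should be close to +/- beta

With slice indicators, the stacked OLS slopes are proportional to the slice means of the standardized predictors, so the recovered span coincides with that of sliced inverse regression; other transformation families (e.g., polynomial or spline bases in Y) fit the same template.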
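For reference, the selection step builds on the SCAD penalty of Fan and Li (2001), defined for \theta \ge 0, a > 2, and tuning parameter \lambda > 0 through its derivative

\[ p'_\lambda(\theta) = \lambda \Big\{ I(\theta \le \lambda) + \frac{(a\lambda - \theta)_+}{(a-1)\lambda} \, I(\theta > \lambda) \Big\}. \]

As we read the name, the "max" form applies p_\lambda to \max_k |b_{jk}|, the largest coefficient predictor j receives across the reduction directions, so a predictor is dropped only when its entire coefficient row vanishes. The difference convex algorithm exploits the decomposition

\[ p_\lambda(\theta) = \lambda \theta - q_\lambda(\theta), \qquad q_\lambda(\theta) = \lambda \theta - p_\lambda(\theta) \ \text{convex on } [0, \infty), \]

and at each iteration replaces the concave part -q_\lambda by its affine minorant at the current iterate, leaving a weighted lasso-type convex subproblem.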
