Stable direction recovery in single-index models with a diverging number of predictors

Abstract

High-dimensional predictors are often introduced into regressions to attenuate possible modeling bias. We consider stable direction recovery in single-index models, in which we assume only that the response Y is independent of the diverging-dimensional predictor X given β0ᵀX, where β0 is a pn × 1 vector and pn → ∞ as the sample size n → ∞. We first explore sufficient conditions under which the least squares estimator βn0 recovers the direction β0 consistently even when pn = o(√n). To enhance model interpretability by excluding irrelevant predictors, we propose an ℓ1-regularization algorithm with a quadratic constraint on the magnitude of the least squares residuals to search for a sparse estimate of β0. Not only does the ℓ1-regularized solution βn recover β0 consistently, it also yields estimators sparse enough to select "important" predictors, facilitating model interpretation while maintaining prediction accuracy. Simulations and an application to the car price data suggest that the proposed estimation procedures have good finite-sample performance and are computationally efficient.
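The ℓ1 step described in the abstract can be read as a constrained convex program in the spirit of stable signal recovery: minimize the ℓ1 norm of the coefficient vector subject to a quadratic bound on the residuals. The sketch below (Python with cvxpy) illustrates one such formulation; the function name, the slack factor, and the calibration of the residual tolerance from the least squares fit are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
import cvxpy as cp


def sparse_direction_estimate(X, y, slack=1.05):
    """Sparse direction recovery via l1 minimization under a residual constraint.

    The tolerance eta is calibrated from the ordinary least squares fit; the
    `slack` factor is an illustrative choice, not the paper's exact rule.
    """
    n, p = X.shape

    # Ordinary least squares fit, used here only to set the residual tolerance.
    beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
    eta = slack * np.linalg.norm(y - X @ beta_ls)

    # Minimize the l1 norm of beta subject to a quadratic bound on the residuals.
    beta = cp.Variable(p)
    problem = cp.Problem(cp.Minimize(cp.norm(beta, 1)),
                         [cp.norm(y - X @ beta, 2) <= eta])
    problem.solve()

    # Only the direction of beta_0 is identifiable in a single-index model,
    # so normalize the solution to unit length.
    b = beta.value
    return b / np.linalg.norm(b)
```

Because only the direction of β0 is identifiable in a single-index model, the solution is normalized to unit length before interpretation.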
