Tuning parameter selectors for the smoothly clipped absolute deviation method.

The penalised least squares approach with the smoothly clipped absolute deviation (SCAD) penalty has been consistently demonstrated to be an attractive regression shrinkage and selection method. It not only selects the important variables automatically and consistently, but also produces estimators that are as efficient as the oracle estimator. However, these attractive features depend on appropriately choosing the tuning parameter. We show that the commonly used generalised cross-validation cannot select the tuning parameter satisfactorily, leaving a non-ignorable overfitting effect in the resulting model. In addition, we propose a BIC tuning parameter selector, which is shown to identify the true model consistently. Simulation studies are presented to support the theoretical findings, and an empirical example illustrates the method's use on the Female Labor Supply data.
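
As a concrete illustration of the idea, the following is a minimal Python/NumPy sketch of a BIC-type tuning parameter selector for SCAD-penalised least squares. It is not the paper's implementation: the SCAD fit uses a simple coordinate-descent heuristic based on the univariate SCAD thresholding rule of Fan and Li (2001), the design columns are assumed standardised and the response centred, and the degrees of freedom are approximated by the number of non-zero coefficients rather than a generalised degrees-of-freedom formula. All function names are illustrative.

```python
import numpy as np

def scad_threshold(z, lam, a=3.7):
    """Univariate SCAD solution (Fan & Li, 2001) for a standardised covariate."""
    az = abs(z)
    if az <= 2 * lam:
        return np.sign(z) * max(az - lam, 0.0)                  # soft-thresholding region
    if az <= a * lam:
        return ((a - 1) * z - np.sign(z) * a * lam) / (a - 2)   # linearly decaying penalty region
    return z                                                    # large signals left unshrunk

def fit_scad(X, y, lam, a=3.7, n_iter=200):
    """Coordinate-descent sketch for SCAD-penalised least squares.
    Assumes y is centred and each column of X satisfies x_j'x_j / n = 1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            z = X[:, j] @ r / n + beta[j]        # univariate LS solution given the others
            new = scad_threshold(z, lam, a)
            r += X[:, j] * (beta[j] - new)       # keep the residual in sync
            beta[j] = new
    return beta

def bic_select(X, y, lambdas, a=3.7):
    """Pick lambda by BIC(lam) = log(SSE/n) + df * log(n)/n,
    with df taken as the number of non-zero coefficients (an approximation)."""
    n = X.shape[0]
    best = None
    for lam in lambdas:
        beta = fit_scad(X, y, lam, a)
        sse = np.sum((y - X @ beta) ** 2)
        df = np.count_nonzero(beta)
        bic = np.log(sse / n) + df * np.log(n) / n
        if best is None or bic < best[0]:
            best = (bic, lam, beta)
    return best[1], best[2]
```

In practice one would compute the SCAD solution path with a dedicated solver (for example, the ncvreg package in R) and apply only the BIC criterion above to each fitted value of the tuning parameter.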
