Penalized regression models with autoregressive error terms

Penalized regression methods have recently attracted considerable attention in statistics and machine learning because of their ability to reduce prediction error and identify important variables at the same time. Numerous studies of penalized regression have been conducted, but most are limited to the case where the data are independently observed. In this paper, we study a variable selection problem in penalized regression models with autoregressive (AR) error terms. We consider three estimators, the adaptive least absolute shrinkage and selection operator (lasso), the bridge, and the smoothly clipped absolute deviation (SCAD), and propose a computational algorithm that simultaneously selects the relevant set of variables and the order of the AR error terms. In addition, we establish asymptotic properties of the estimators, including consistency, selection consistency, and asymptotic normality. The performances of the three estimators are compared with one another using simulated and real examples.
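To make the setting concrete, the classical two-step idea behind such models, obtain preliminary residuals, estimate the AR structure, prewhiten, and then apply a sparsity penalty, can be sketched as below. This is a minimal illustration only, assuming an AR(1) error, a Cochrane-Orcutt-style transform, and a plain coordinate-descent lasso rather than the paper's joint selection algorithm; the simulated data, the helper `lasso_cd`, and the penalty level `lam` are all illustrative choices, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, phi = 400, 6, 0.6
beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])  # sparse true coefficients

# Simulate a regression with AR(1) errors: u_t = phi * u_{t-1} + e_t
X = rng.standard_normal((n, p))
e = rng.standard_normal(n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = phi * u[t - 1] + e[t]
y = X @ beta + u

# Step 1: OLS residuals give a preliminary estimate of the AR(1) coefficient
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
r = y - X @ b_ols
phi_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])

# Step 2: prewhiten the data (Cochrane-Orcutt transform) ...
ys = y[1:] - phi_hat * y[:-1]
Xs = X[1:] - phi_hat * X[:-1]

# ... and run a lasso on the transformed data (coordinate descent
# with soft-thresholding; a stand-in for any of the three penalties)
def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual w.r.t. x_j
            z = X[:, j] @ r_j
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_ss[j]
    return b

b_lasso = lasso_cd(Xs, ys, lam=0.5 * len(ys) ** 0.5)
```

With the errors decorrelated in step 2, the penalized fit behaves as in the independent-data case; the paper's contribution is, in effect, to carry out the selection of the regression variables and of the AR order jointly rather than in separate steps.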
