On the Oracle Property of the Adaptive Lasso in Stationary and Nonstationary Autoregressions

We show that the Adaptive LASSO is oracle efficient in stationary and nonstationary autoregressions. This means that it estimates the parameters consistently, selects the correct sparsity pattern, and estimates the coefficients of the relevant variables with the same asymptotic efficiency as if only these variables had been included in the model from the outset. In particular, this implies that it can discriminate between stationary and nonstationary autoregressions, and it thereby constitutes an addition to the set of unit root tests. However, we also show that when the Adaptive LASSO is tuned to perform consistent model selection, it has no power against shrinking alternatives of the form c/T, where c is a constant and T is the sample size. If it is instead tuned to perform conservative model selection, it does have power against shrinking alternatives of this form. Monte Carlo experiments reveal that the Adaptive LASSO performs particularly well in the presence of a unit root while being on par with its competitors in the stationary setting.
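To make the estimator concrete, the following is a minimal Python sketch of an adaptive LASSO fit to an AR(p) model in the spirit of Zou (2006): first-stage OLS estimates supply the penalty weights w_j = 1/|b̂_j|^γ, and the weighted problem is solved by rescaling the regressors. The simulated AR(2) process, the choice γ = 1, and the fixed penalty level `lam` are illustrative assumptions, not the paper's specification; in particular, how the penalty level scales with T is precisely what separates consistent from conservative model selection.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

# Hypothetical illustration of the adaptive LASSO for an AR(p) model,
# following Zou (2006). This is a sketch, not the paper's implementation.

rng = np.random.default_rng(0)

# Simulate a stationary AR(2): y_t = 0.5 y_{t-1} + 0.2 y_{t-2} + e_t
T, p = 500, 4  # fit an AR(4), so two of the candidate lags are irrelevant
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 2] + rng.standard_normal()

# Lag matrix X: row t holds (y_{t-1}, ..., y_{t-p}) for t = p, ..., T-1
X = np.column_stack([y[p - j - 1:T - j - 1] for j in range(p)])
Y = y[p:]

# Step 1: first-stage OLS estimates yield the adaptive weights
ols = LinearRegression(fit_intercept=False).fit(X, Y)
gamma = 1.0
w = 1.0 / np.abs(ols.coef_) ** gamma

# Step 2: the weighted LASSO, min ||Y - Xb||^2 + lam * sum_j w_j |b_j|,
# is equivalent to a plain LASSO in the rescaled regressors X_j / w_j,
# whose coefficients c_j recover b_j = c_j / w_j.
lam = 0.05  # placeholder; the growth rate of the penalty with T governs
            # consistent versus conservative model selection
fit = Lasso(alpha=lam, fit_intercept=False).fit(X / w, Y)
beta = fit.coef_ / w

print("estimated AR coefficients:", np.round(beta, 3))
print("selected lags:", np.flatnonzero(beta != 0) + 1)
```

With the seed above, the irrelevant lags are shrunk exactly to zero while the first two lags are retained, which is the sparsity-pattern recovery that the oracle property formalizes asymptotically.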
