Regularization and variable selection via the elastic net

Summary.  We propose the elastic net, a new regularization and variable selection method. Real-world data and a simulation study show that the elastic net often outperforms the lasso while enjoying a similar sparsity of representation. In addition, the elastic net encourages a grouping effect, whereby strongly correlated predictors tend to enter or leave the model together. The elastic net is particularly useful when the number of predictors (p) is much bigger than the number of observations (n). By contrast, the lasso is not a very satisfactory variable selection method in the p≫n case. An algorithm called LARS-EN is proposed for computing elastic net regularization paths efficiently, much as the LARS algorithm does for the lasso.
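The elastic net penalizes the residual sum of squares with a mixture of the lasso (L1) and ridge (L2) penalties. As a minimal illustration only, and not the LARS-EN path algorithm proposed in the paper, the sketch below fits an elastic net with scikit-learn, which solves the same kind of mixed-penalty least-squares problem by coordinate descent; the simulated data and parameter values are arbitrary assumptions for demonstration.

```python
# Minimal elastic net illustration using scikit-learn (coordinate descent),
# not the LARS-EN path algorithm from the paper. The simulated data and
# parameter values below are arbitrary, for demonstration only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet

# Simulate a p > n regression problem with a few informative predictors.
X, y = make_regression(n_samples=50, n_features=200, n_informative=10,
                       noise=5.0, random_state=0)

# scikit-learn minimizes
#   (1 / (2n)) * ||y - X b||^2
#     + alpha * l1_ratio * ||b||_1
#     + 0.5 * alpha * (1 - l1_ratio) * ||b||^2,
# i.e. a mixed L1/L2 (elastic net) penalty.
model = ElasticNet(alpha=0.5, l1_ratio=0.7, max_iter=10000)
model.fit(X, y)

# The L1 component yields a sparse solution: many coefficients are exactly zero.
selected = np.flatnonzero(model.coef_)
print(f"{selected.size} of {model.coef_.size} predictors selected")
```

Varying l1_ratio moves the fit between ridge-like behavior (small values, heavier L2 weight) and lasso-like behavior (values near 1), which is one way to observe the grouping effect on correlated predictors described in the summary.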
