Bayesian bridge regression

ABSTRACT Classical bridge regression is known to possess many desirable statistical properties, such as the oracle property, sparsity, and unbiasedness. A notable disadvantage of bridge regularization, however, is that it lacks a systematic approach to inference, which limits its flexibility in practical applications. In this study, we develop bridge regression from a Bayesian perspective. Unlike classical bridge regression, which summarizes inference with a single point estimate, the proposed Bayesian method provides uncertainty estimates of the regression parameters, allowing coherent inference through the posterior distribution. Under a sparsity assumption on the high-dimensional parameter, we provide sufficient conditions for strong posterior consistency under the Bayesian bridge prior. On simulated datasets, we show that the proposed method performs well relative to several competing methods across a wide range of scenarios. Application to two real datasets further shows that the proposed method performs as well as or better than published methods while offering the advantage of posterior inference.
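To make the setting concrete, the bridge penalty and its Bayesian counterpart can be sketched as follows; this is the standard formulation of bridge regularization, and the notation (response y, design matrix X, concavity parameter \alpha, scale \lambda) is introduced here for illustration rather than taken from the paper. The classical bridge estimator solves

\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^{p}} \; \lVert y - X\beta \rVert_2^{2} \;+\; \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert^{\alpha}, \qquad 0 < \alpha \le 2,

which recovers the lasso at \alpha = 1 and ridge regression at \alpha = 2. The corresponding Bayesian bridge prior is the exponential power (generalized Gaussian) density

\pi(\beta_j \mid \lambda, \alpha) \;\propto\; \exp\!\left(-\lambda \lvert \beta_j \rvert^{\alpha}\right), \qquad j = 1, \dots, p,

so that, under a Gaussian likelihood, the posterior mode coincides with the classical bridge estimate (up to a rescaling of \lambda by the error variance), while the full posterior supplies the uncertainty estimates described above. Because the exponential power prior admits a scale mixture of normals representation, Gibbs-type samplers for the full posterior are typically available.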
