Sparse regression at scale: branch-and-bound rooted in first-order optimization