An investigation of Newton-Sketch and subsampled Newton methods
Albert S. Berahas | Raghu Bollapragada | Jorge Nocedal
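The full text is not reproduced here, but as context for the reference list that follows, the subsampled Newton idea studied in this line of work can be sketched in a few lines: the gradient is computed exactly while the Hessian is estimated from a random subset of the data. The snippet below is an illustrative sketch for regularized logistic regression, not the paper's exact algorithm; the function name and all parameter choices are hypothetical.

```python
# Hedged sketch of one subsampled Newton iteration (illustrative only;
# not the algorithm as specified in the paper under discussion).
import numpy as np

def subsampled_newton_step(w, X, y, lam=0.1, sample_size=64, rng=None):
    """One Newton-type step: full gradient, Hessian from a random subsample."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]

    # Full gradient of the L2-regularized logistic loss.
    p = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ (p - y) / n + lam * w

    # Hessian estimated from a random subsample of the rows.
    idx = rng.choice(n, size=min(sample_size, n), replace=False)
    Xs = X[idx]
    ps = 1.0 / (1.0 + np.exp(-Xs @ w))
    D = ps * (1.0 - ps)  # per-sample curvature weights
    H = (Xs.T * D) @ Xs / len(idx) + lam * np.eye(X.shape[1])

    # Newton direction from the subsampled Hessian; unit step for simplicity
    # (the papers below analyze line searches and adaptive sample sizes).
    return w - np.linalg.solve(H, grad)
```

Because the regularization term keeps the subsampled Hessian positive definite, the computed direction is still a descent direction for the full objective even when the subsample is small.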
[1] Gene H. Golub, et al. Matrix computations, 1983.
[2] David G. Luenberger, et al. Linear and nonlinear programming, 1984.
[3] Harris Drucker, et al. Learning algorithms for classification: A comparison on handwritten digit recognition, 1995.
[4] Stephen J. Wright, et al. Numerical Optimization, 2018, Fundamental Statistical Inference.
[5] D. K. Smith, et al. Numerical Optimization, 2001, J. Oper. Res. Soc.
[7] Yurii Nesterov, et al. Introductory Lectures on Convex Optimization - A Basic Course, 2014, Applied Optimization.
[8] H. Robbins. A Stochastic Approximation Method, 1951.
[9] Isabelle Guyon, et al. Design and Analysis of the Causation and Prediction Challenge, 2008, WCCI Causation and Prediction Challenge.
[10] Chih-Jen Lin, et al. LIBSVM: A library for support vector machines, 2011, TIST.
[11] Rachel Ward, et al. New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property, 2010, SIAM J. Math. Anal.
[12] Jorge Nocedal, et al. On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning, 2011, SIAM J. Optim.
[13] S. Muthukrishnan, et al. Faster least squares approximation, 2007, Numerische Mathematik.
[14] Tong Zhang, et al. Accelerating Stochastic Gradient Descent using Predictive Variance Reduction, 2013, NIPS.
[15] Stephen J. Wright, et al. Optimization for Machine Learning, 2013.
[16] Yoram Singer, et al. Parallel Boosting with Momentum, 2013, ECML/PKDD.
[17] Francis Bach, et al. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives, 2014, NIPS.
[18] David P. Woodruff. Sketching as a Tool for Numerical Linear Algebra, 2014, Found. Trends Theor. Comput. Sci.
[19] Stephen J. Wright. Coordinate descent algorithms, 2015, Mathematical Programming.
[20] Andrea Montanari, et al. Convergence rates of sub-sampled Newton methods, 2015, NIPS.
[21] Haipeng Luo, et al. Efficient Second Order Online Learning by Sketching, 2016, NIPS.
[22] Michael W. Mahoney, et al. Sub-Sampled Newton Methods I: Globally Convergent Algorithms, 2016, arXiv.
[23] J. Nocedal, et al. Exact and Inexact Subsampled Newton Methods for Optimization, 2016, arXiv:1609.08502.
[24] Peng Xu, et al. Sub-sampled Newton Methods with Non-uniform Sampling, 2016, NIPS.
[25] Michael W. Mahoney, et al. Sub-Sampled Newton Methods II: Local Convergence Rates, 2016, arXiv.
[26] Naman Agarwal, et al. Second Order Stochastic Optimization in Linear Time, 2016, arXiv.
[27] Lam M. Nguyen, et al. SARAH: A Novel Method for Machine Learning Problems Using Stochastic Recursive Gradient, 2017, ICML.
[28] Mark W. Schmidt, et al. Minimizing finite sums with the stochastic average gradient, 2013, Mathematical Programming.
[29] Lan Wang, et al. Sparse Concordance-assisted Learning for Optimal Treatment Decision, 2018, J. Mach. Learn. Res.
[30] Mladen Kolar, et al. Sketching Meets Random Projection in the Dual: A Provable Recovery Algorithm for Big and High-dimensional Data, 2016, AISTATS.
[31] Ohad Shamir, et al. Oracle Complexity of Second-Order Methods for Finite-Sum Problems, 2016, ICML.
[32] Shusen Wang, et al. Sketched Ridge Regression: Optimization Perspective, Statistical Perspective, and Model Averaging, 2017, ICML.
[33] Martin J. Wainwright, et al. Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence, 2015, SIAM J. Optim.
[34] Naman Agarwal, et al. Second-Order Stochastic Optimization for Machine Learning in Linear Time, 2016, J. Mach. Learn. Res.
[35] Peng Xu, et al. Inexact Non-Convex Newton-Type Methods, 2018, arXiv:1802.06925.
[36] Jorge Nocedal, et al. Optimization Methods for Large-Scale Machine Learning, 2016, SIAM Rev.
[37] Michael W. Mahoney, et al. Sub-sampled Newton methods, 2018, Math. Program.
[38] Peng Xu, et al. Second-Order Optimization for Non-Convex Machine Learning: An Empirical Study, 2017, SDM.
[39] Peng Xu, et al. Newton-type methods for non-convex optimization under inexact Hessian information, 2017, Math. Program.