Randomized Block Cubic Newton Method
[1] Nicholas I. M. Gould, et al. Trust Region Methods, 2000, MOS-SIAM Series on Optimization.
[2] Nicholas I. M. Gould, et al. On solving trust-region and other regularised subproblems in optimization, 2010, Math. Program. Comput.
[3] Yurii Nesterov, et al. Accelerating the cubic regularization of Newton's method on convex problems, 2005, Math. Program.
[4] Katya Scheinberg, et al. Global convergence rate analysis of unconstrained optimization methods based on probabilistic models, 2015, Mathematical Programming.
[5] Peter Richtárik, et al. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, 2011, Mathematical Programming.
[6] Yair Carmon, et al. Gradient Descent Efficiently Finds the Cubic-Regularized Non-Convex Newton Step, 2016, arXiv.
[7] Tengyu Ma, et al. Finding approximate local minima faster than gradient descent, 2016, STOC.
[8] Paul Tseng, et al. A coordinate gradient descent method for nonsmooth separable minimization, 2008, Math. Program.
[9] Nicholas I. M. Gould, et al. Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity, 2011, Math. Program.
[10] Yurii Nesterov, et al. Gradient methods for minimizing composite functions, 2012, Mathematical Programming.
[11] Martin J. Wainwright, et al. Randomized sketches of convex programs with sharp guarantees, 2014, IEEE International Symposium on Information Theory.
[12] Tōru Maruyama (丸山徹). A few developments in convex analysis (in Japanese), 1977.
[13] Shai Shalev-Shwartz, et al. Stochastic dual coordinate ascent methods for regularized loss, 2012, J. Mach. Learn. Res.
[14] D. Bertsekas. Local Convex Conjugacy and Fenchel Duality, 1978.
[15] Saeed Ghadimi, et al. Second-Order Methods with Cubic Regularization Under Inexact Information, 2017, arXiv:1710.05782.
[16] Peter Richtárik, et al. SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization, 2015, ICML.
[17] Peter Richtárik, et al. Parallel Stochastic Newton Method, 2017, Journal of Computational Mathematics.
[18] Yurii Nesterov, et al. Modified Gauss–Newton scheme with worst case guarantees for global performance, 2007, Optim. Methods Softw.
[19] Yurii Nesterov, et al. Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians, 2017, SIAM J. Optim.
[20] Tengyu Ma, et al. Finding Approximate Local Minima for Nonconvex Optimization in Linear Time, 2016, arXiv.
[21] Peter Richtárik, et al. Parallel coordinate descent methods for big data optimization, 2012, Mathematical Programming.
[22] Nicholas I. M. Gould, et al. Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results, 2011, Math. Program.
[23] Peter Richtárik, et al. Coordinate descent with arbitrary sampling II: expected separable overapproximation, 2014, Optim. Methods Softw.
[24] Yurii Nesterov, et al. Cubic regularization of Newton method and its global performance, 2006, Math. Program.
[25] Aurélien Lucchi, et al. Sub-sampled Cubic Regularization for Non-convex Optimization, 2017, ICML.
[26] Michael I. Jordan, et al. Stochastic Cubic Regularization for Fast Nonconvex Optimization, 2017, NeurIPS.
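The common primitive in many of the works cited above (notably [3], [9], [22], and [24]) is the cubic-regularized Newton step: at each iterate, minimize the model m(s) = gᵀs + ½ sᵀHs + (M/6)‖s‖³ built from the gradient g and Hessian H. The following is a minimal NumPy sketch, not the paper's randomized block method: it solves the cubic subproblem by plain gradient descent on m (in the spirit of [6]) and applies the full-dimensional iteration of [24] to a small convex quadratic. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def cubic_newton_step(g, H, M, iters=500, lr=0.01):
    """Approximately minimize the cubic model
        m(s) = g.s + 0.5 s'Hs + (M/6)||s||^3
    by gradient descent on m; its gradient is g + Hs + (M/2)||s|| s."""
    s = np.zeros_like(g)
    for _ in range(iters):
        grad_m = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s -= lr * grad_m
    return s

def cubic_newton(grad, hess, x0, M, steps=20):
    """Cubic-regularized Newton iteration (Nesterov & Polyak style):
    repeatedly move to the approximate minimizer of the local cubic model."""
    x = x0.copy()
    for _ in range(steps):
        x = x + cubic_newton_step(grad(x), hess(x), M)
    return x

# Illustrative use: minimize f(x) = 0.5 x'Ax - b.x, whose minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = cubic_newton(lambda x: A @ x - b,   # gradient of f
                      lambda x: A,           # constant Hessian
                      np.zeros(2), M=1.0)
```

Near the optimum the gradient (and hence the step) shrinks, so the cubic penalty vanishes and the iteration behaves like a damped Newton method; the block variants in the title restrict s to a randomly sampled block of coordinates at each step.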