Ahmet Alacaoglu | Stephen J. Wright | Volkan Cevher
[1] Lin Xiao, et al. A Proximal Stochastic Gradient Method with Progressive Variance Reduction, 2014, SIAM J. Optim.
[2] Alexander Shapiro, et al. Stochastic Approximation Approach to Stochastic Programming, 2013.
[3] Shai Shalev-Shwartz, et al. Stochastic Dual Coordinate Ascent Methods for Regularized Loss, 2012, J. Mach. Learn. Res.
[4] Yuchen Zhang, et al. Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization, 2014, ICML.
[5] Zeyuan Allen-Zhu, et al. Katyusha: The First Direct Acceleration of Stochastic Gradient Methods, 2017, STOC.
[6] Heinz H. Bauschke, et al. Convex Analysis and Monotone Operator Theory in Hilbert Spaces, 2011, CMS Books in Mathematics.
[7] Stephen J. Wright, et al. Coordinate Linear Variance Reduction for Generalized Linear Programming, 2021, arXiv.
[8] Antonin Chambolle, et al. Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications, 2017, SIAM J. Optim.
[9] Stephen J. Wright. Coordinate Descent Algorithms, 2015, Mathematical Programming.
[10] Pascal Bianchi, et al. A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions, 2015, SIAM J. Optim.
[11] Stephen J. Wright, et al. Variance Reduction via Primal-Dual Accelerated Dual Averaging for Nonsmooth Convex Finite-Sums, 2021, ICML.
[12] Tong Zhang, et al. Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization, 2013, Mathematical Programming.
[13] Tong Zhang, et al. Accelerated Dual-Averaging Primal–Dual Method for Composite Convex Minimization, 2020, Optim. Methods Softw.
[14] On the Convergence of Stochastic Primal-Dual Hybrid Gradient, 2019, arXiv:1911.00799.
[15] Antonin Chambolle, et al. A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging, 2011, Journal of Mathematical Imaging and Vision.
[16] Yura Malitsky, et al. Stochastic Variance Reduction for Variational Inequality Methods, 2021, arXiv.
[17] Tōru Maruyama (丸山徹). On Some Recent Developments in Convex Analysis (in Japanese), 1977.
[18] Zeyuan Allen-Zhu, et al. Katyusha: The First Direct Acceleration of Stochastic Gradient Methods, 2016, J. Mach. Learn. Res.
[19] Yurii Nesterov, et al. Primal-Dual Subgradient Methods for Convex Problems, 2005, Math. Program.
[20] Kevin Tian, et al. Variance Reduction for Matrix Games, 2019, NeurIPS.
[21] Guanghui Lan, et al. First-Order and Stochastic Optimization Methods for Machine Learning, 2020.
[22] Volkan Cevher, et al. A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization, 2015, SIAM J. Optim.
[23] Kevin Tian, et al. Coordinate Methods for Matrix Games, 2020, FOCS.
[24] Volkan Cevher, et al. Random Extrapolation for Primal-Dual Coordinate Descent, 2020, ICML.
[25] Yurii Nesterov, et al. Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems, 2012, SIAM J. Optim.
[26] Yurii Nesterov, et al. Dual Extrapolation and Its Applications to Solving Variational Inequalities and Related Problems, 2003, Math. Program.
[27] Antonin Chambolle, et al. On the Ergodic Convergence Rates of a First-Order Primal–Dual Algorithm, 2016, Math. Program.