Sparse Learning for Stochastic Composite Optimization
Rong Jin | Xuelong Li | Xiaofei He | Lijun Zhang | Deng Cai | Weizhong Zhang | Zhongming Jin | Ronghua Liang
[1] Xi Chen,et al. Optimal Regularized Dual Averaging Methods for Stochastic Optimization , 2012, NIPS.
[2] Saeed Ghadimi,et al. Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework , 2012, SIAM J. Optim..
[3] Guanghui Lan,et al. An optimal method for stochastic composite optimization , 2011, Mathematical Programming.
[4] Ohad Shamir,et al. Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization , 2011, ICML.
[5] Lin Xiao,et al. Dual Averaging Methods for Regularized Stochastic Learning and Online Optimization , 2009, J. Mach. Learn. Res..
[6] Yoram Singer,et al. Efficient Online and Batch Learning Using Forward Backward Splitting , 2009, J. Mach. Learn. Res..
[7] Emmanuel J. Candès,et al. NESTA: A Fast and Accurate First-Order Method for Sparse Recovery , 2009, SIAM J. Imaging Sci..
[8] I. Daubechies,et al. Iteratively reweighted least squares minimization for sparse recovery , 2008, arXiv:0807.0575.
[9] John Langford,et al. Sparse Online Learning via Truncated Gradient , 2008, NIPS.
[10] Wotao Yin,et al. Iteratively reweighted algorithms for compressive sensing , 2008, 2008 IEEE International Conference on Acoustics, Speech and Signal Processing.
[11] Y. Nesterov. Gradient methods for minimizing composite objective function , 2007 .
[12] Elad Hazan,et al. Logarithmic regret algorithms for online convex optimization , 2006, Machine Learning.
[13] Yoram Singer,et al. Data-Driven Online to Batch Conversions , 2005, NIPS.
[14] Nick Littlestone,et al. From on-line to batch learning , 1989, COLT '89.
[15] Qihang Lin. A Sparsity Preserving Stochastic Gradient Method for Composite Optimization , 2011 .
[16] Elad Hazan,et al. Beyond the regret minimization barrier: an optimal algorithm for stochastic strongly-convex optimization , 2011, COLT.