Parallel coordinate descent methods for big data optimization
[1] Yurii Nesterov, et al. Subgradient methods for huge-scale optimization problems, 2013, Mathematical Programming.
[2] Yurii Nesterov, et al. Introductory Lectures on Convex Optimization - A Basic Course, 2014, Applied Optimization.
[3] Ming Yan, et al. Parallel and distributed sparse optimization, 2013, 2013 Asilomar Conference on Signals, Systems and Computers.
[4] Peter Richtárik, et al. On optimal probabilities in stochastic coordinate descent methods, 2013, Optim. Lett.
[5] Peter Richtárik, et al. Distributed Coordinate Descent Method for Learning with Big Data, 2013, J. Mach. Learn. Res.
[6] Peter Richtárik, et al. Inexact Coordinate Descent: Complexity and Preconditioning, 2013, J. Optim. Theory Appl.
[7] Ambuj Tewari, et al. On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods, 2013, SIAM J. Optim.
[8] Avleen Singh Bijral, et al. Mini-Batch Primal and Dual Methods for SVMs, 2013, ICML.
[9] Ion Necoara, et al. A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints, 2013, Comput. Optim. Appl.
[10] Ion Necoara, et al. Efficient parallel coordinate descent algorithm for convex optimization problems with separable constraints: Application to distributed MPC, 2013, arXiv:1302.3092.
[11] Inderjit S. Dhillon, et al. Scalable Coordinate Descent Approaches to Parallel Matrix Factorization for Recommender Systems, 2012, 2012 IEEE 12th International Conference on Data Mining.
[12] Ambuj Tewari, et al. Feature Clustering for Accelerating Parallel Coordinate Descent, 2012, NIPS.
[13] Yurii Nesterov, et al. Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems, 2012, SIAM J. Optim.
[14] Pradeep Ravikumar, et al. Nearest Neighbor based Greedy Coordinate Descent, 2011, NIPS.
[15] Peter Richtárik, et al. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, 2011, Mathematical Programming.
[16] Stephen J. Wright, et al. Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent, 2011, NIPS.
[17] Joseph K. Bradley, et al. Parallel Coordinate Descent for L1-Regularized Loss Minimization, 2011, ICML.
[18] Alejandro Ribeiro, et al. Accelerated dual descent for network optimization, 2011, Proceedings of the 2011 American Control Conference.
[19] S. Osher, et al. Coordinate descent optimization for ℓ1 minimization with application to compressed sensing; a greedy algorithm, 2009.
[20] Adrian S. Lewis, et al. Randomized Methods for Linear Constraints: Convergence Rates and Conditioning, 2008, Math. Oper. Res.
[21] Y. Nesterov. Gradient methods for minimizing composite objective function, 2007.
[22] R. Vershynin, et al. A Randomized Kaczmarz Algorithm with Exponential Convergence, 2007, arXiv:math/0702226.
[23] Andrzej Ruszczynski, et al. On Convergence of an Augmented Lagrangian Decomposition Method for Sparse Convex Optimization, 1995, Math. Oper. Res.
[24] Jinchao Xu, et al. Iterative Methods by Space Decomposition and Subspace Correction, 1992, SIAM Rev.
[25] Peter Richtárik, et al. Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design, 2011, OR.
[26] Peter Richtárik, et al. Efficiency of randomized coordinate descent methods on minimization problems with a composite objective function, 2011.
[27] S. Shalev-Shwartz, et al. Stochastic Methods for l1-regularized Loss Minimization, 2011, J. Mach. Learn. Res.