Accelerated Kaczmarz Algorithms using History Information

The Kaczmarz algorithm is a well-known iterative method for solving overdetermined linear systems. Its randomized version yields provably exponential convergence in expectation. In this paper, we propose two new methods that speed up the randomized Kaczmarz algorithm by exploiting past estimates from earlier iterations. The first uses the past estimates to build a preconditioner. The second combines the stochastic average gradient (SAG) method with the randomized Kaczmarz algorithm, taking advantage of past gradients to improve the convergence speed. Numerical experiments indicate that the new algorithms can dramatically outperform the standard randomized Kaczmarz algorithm.
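
As background for the two accelerated variants, here is a minimal NumPy sketch of the baseline randomized Kaczmarz iteration of Strohmer and Vershynin: each step samples a row with probability proportional to its squared norm and projects the current iterate onto that row's hyperplane. A second function illustrates how SAG-style averaging of past gradients might be grafted onto Kaczmarz steps; the paper's precise update rule is not reproduced here, so the function names, step size, uniform sampling, and gradient table below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=10_000, seed=0):
    """Baseline randomized Kaczmarz for a consistent system A x = b.

    Rows are sampled with probability proportional to ||a_i||^2, as in
    Strohmer and Vershynin's analysis.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Project x onto the hyperplane {z : <a_i, z> = b_i}.
        x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
    return x

def sag_kaczmarz(A, b, iters=10_000, step=1.0, seed=0):
    """Illustrative SAG-flavored Kaczmarz (an assumption, not the paper's
    exact scheme): each normalized residual direction
    a_i (a_i^T x - b_i) / ||a_i||^2 is treated as a per-row gradient,
    and stored past gradients are averaged as in SAG.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    grad_table = np.zeros((m, n))  # last gradient seen for each row
    grad_sum = np.zeros(n)         # running sum of the table rows
    for _ in range(iters):
        i = rng.integers(m)
        g_new = (A[i] @ x - b[i]) / row_norms_sq[i] * A[i]
        # Replace row i's stored gradient and update the running sum.
        grad_sum += g_new - grad_table[i]
        grad_table[i] = g_new
        # step = 1.0 corresponds to 1/L here, since each normalized
        # per-row gradient is 1-Lipschitz; a common practical choice.
        x -= step * grad_sum / m
    return x

# Quick sanity check on a random consistent overdetermined system.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 50))
x_true = rng.standard_normal(50)
b = A @ x_true
print(np.linalg.norm(randomized_kaczmarz(A, b) - x_true))
print(np.linalg.norm(sag_kaczmarz(A, b) - x_true))
```

The table-based averaging is what lets past gradients contribute to every update. Note that for Kaczmarz each stored gradient is a scalar residual times a fixed row of A, so the O(mn) table in this sketch could be compressed to m scalars.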
