Fast distributed coordinate descent for non-strongly convex losses

We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k^2) convergence rate, where k is the iteration counter. The core of the work is a theoretical study of the stepsize parameters. We have implemented the method on Archer, the largest supercomputer in the UK, and show that it is capable of solving a (synthetic) LASSO optimization problem with 50 billion variables.
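For orientation, the sketch below shows plain (serial, non-accelerated) randomized coordinate descent applied to the LASSO objective min_x (1/2)||Ax - b||^2 + lam*||x||_1, the problem class targeted here. It is an illustrative baseline under assumed names and parameters, not the paper's distributed or accelerated method; the coordinate stepsizes 1/L_i it uses are the simplest instance of the stepsize parameters the paper studies.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*|.|: shrink v toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_rcd(A, b, lam, iters=10000, seed=0):
    """Randomized coordinate descent for (1/2)||Ax - b||^2 + lam*||x||_1.
    Serial illustrative baseline; an assumption, not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    resid = -b.copy()            # resid = A @ x - b, maintained incrementally
    L = (A ** 2).sum(axis=0)     # coordinate-wise Lipschitz constants ||A_i||^2
    for _ in range(iters):
        i = rng.integers(n)      # sample a coordinate uniformly at random
        g = A[:, i] @ resid      # partial derivative of the smooth part
        # Prox step with stepsize 1/L_i on the chosen coordinate only.
        x_new = soft_threshold(x[i] - g / L[i], lam / L[i])
        resid += (x_new - x[i]) * A[:, i]   # keep residual in sync with x
        x[i] = x_new
    return x

# Tiny usage example on random data.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 50))
b = rng.standard_normal(100)
x = lasso_rcd(A, b, lam=0.1)
print("objective:", 0.5 * np.sum((A @ x - b) ** 2) + 0.1 * np.abs(x).sum())
```

This baseline converges at an O(1/k) rate on non-strongly convex problems; reaching the optimal O(1/k^2) rate claimed above requires the accelerated variant, and scaling to billions of variables requires the distributed implementation described in the paper.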