Accelerating the cubic regularization of Newton’s method on convex problems

In this paper we propose an accelerated version of the cubic regularization of Newton's method (Nesterov and Polyak, Math. Program. 108(1):177–205, 2006). The original version, used for minimizing a convex function with Lipschitz-continuous Hessian, guarantees a global rate of convergence of order $O\big(\frac{1}{k^2}\big)$, where $k$ is the iteration counter. Our modified version converges for the same problem class with order $O\big(\frac{1}{k^3}\big)$, keeping the complexity of each iteration unchanged. We study the complexity of both schemes on different classes of convex problems. In particular, we argue that for second-order schemes, the class of non-degenerate problems differs from the standard class.
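For concreteness, the cubic Newton step underlying both schemes can be written as follows; this is a minimal sketch in standard notation rather than an excerpt from the paper's body, with $M$ assumed to be an upper bound on the Lipschitz constant of the Hessian $\nabla^2 f$:

$$T_M(x) \;=\; \arg\min_{y \in \mathbb{R}^n}\Big[\langle \nabla f(x),\, y - x\rangle + \tfrac{1}{2}\langle \nabla^2 f(x)(y - x),\, y - x\rangle + \tfrac{M}{6}\,\|y - x\|^3\Big].$$

The original scheme simply iterates $x_{k+1} = T_M(x_k)$ and attains the $O\big(\frac{1}{k^2}\big)$ rate on this problem class; the accelerated variant uses the same mapping $T_M$ as its second-order primitive, which is consistent with the abstract's claim that the per-iteration cost is unchanged while the rate improves to $O\big(\frac{1}{k^3}\big)$.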

[1] A. A. Bennett. Newton's method in general analysis. Proceedings of the National Academy of Sciences of the United States of America, 1916.

[2] L. V. Kantorovich. Functional analysis and applied mathematics, 1963.

[3] J. M. Ortega and W. C. Rheinboldt. Iterative Solution of Nonlinear Equations in Several Variables. Computer Science and Applied Mathematics, Academic Press, 1970.

[4] J. E. Dennis and R. B. Schnabel. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice Hall Series in Computational Mathematics, Prentice Hall, 1983.

[5] R. D. Murphy. Iterative solution of nonlinear equations, 1994.

[6] A. R. Conn, N. I. M. Gould, and Ph. L. Toint. Trust Region Methods. MPS-SIAM Series on Optimization, SIAM, 2000.

[7] Yu. Nesterov. Introductory Lectures on Convex Optimization: A Basic Course. Applied Optimization, Kluwer Academic Publishers, 2004.

[8] Yu. Nesterov and B. T. Polyak. Cubic regularization of Newton method and its global performance. Mathematical Programming, 108(1):177–205, 2006.