Contracting Proximal Methods for Smooth Convex Optimization

In this paper, we propose new accelerated methods for smooth convex optimization, called Contracting Proximal Methods. At every step of these methods, we minimize a contracted version of the objective function augmented by a regularization term in the form of a Bregman divergence. We provide a global convergence analysis for a general scheme that admits inexactness in solving the auxiliary subproblem. When high-order Tensor Methods are used for this purpose, we demonstrate an acceleration effect for both convex and uniformly convex composite objective functions. Thus, our construction explains acceleration for methods of any order, starting from first-order methods. The increase in the number of oracle calls caused by computing the contracted proximal steps is bounded by a logarithmic factor in the worst-case complexity bound.
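To make the mechanism concrete, here is a minimal Python sketch of one pass of a contracting proximal scheme as described above: each outer step solves, inexactly, a subproblem combining a contracted copy of the objective with a Bregman regularizer. The contraction coefficients a_{k+1} = k+1, the plain gradient-descent inner solver, and all step sizes are illustrative assumptions, not the paper's choices; the paper instead solves the subproblem with high-order Tensor Methods under a controlled inexactness criterion.

```python
import numpy as np

def contracting_proximal(f_grad, d_grad, x0, n_steps=50,
                         inner_steps=200, inner_lr=1e-2):
    """Sketch of a contracting proximal scheme.

    f_grad : gradient of the smooth convex objective f
    d_grad : gradient of the prox-function d generating the
             Bregman divergence beta_d(v; x)
    x0     : starting point (numpy array)

    All schedules below are hypothetical stand-ins for the
    paper's tuned coefficients and inner solvers.
    """
    xk = x0.copy()
    vk = x0.copy()
    Ak = 0.0
    for k in range(n_steps):
        ak = k + 1.0              # illustrative contraction coefficient
        Ak_next = Ak + ak

        # Gradient of the subproblem
        #   min_x  A_{k+1} f((a_{k+1} x + A_k x_k) / A_{k+1}) + beta_d(v_k; x),
        # where grad_x beta_d(v_k; x) = d'(x) - d'(v_k).
        def subproblem_grad(x):
            y = (ak * x + Ak * xk) / Ak_next
            return ak * f_grad(y) + (d_grad(x) - d_grad(vk))

        # Inexact inner solve by plain gradient descent (a stand-in
        # for the high-order methods analyzed in the paper).
        v = vk.copy()
        for _ in range(inner_steps):
            v -= inner_lr * subproblem_grad(v)

        vk = v
        xk = (ak * vk + Ak * xk) / Ak_next
        Ak = Ak_next
    return xk

if __name__ == "__main__":
    # Toy usage: minimize f(x) = 0.5 * ||x||^2 with the Euclidean
    # prox-function d(x) = 0.5 * ||x||^2, so d'(x) = x.
    x_star = contracting_proximal(lambda x: x, lambda x: x, np.ones(5))
    print(x_star)  # should be close to the minimizer at the origin
```

With the Euclidean prox-function the Bregman term reduces to the usual squared distance, recovering a classical inexact proximal point flavor; other choices of d adapt the regularizer to the geometry of the problem.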
