Optimization Methods for Fully Composite Problems

In this paper, we propose a new Fully Composite formulation of convex optimization problems. It includes, as particular cases, problems with functional constraints, max-type minimization problems, and problems of composite minimization, where the objective can have simple nondifferentiable components. We treat all these formulations in a unified way, highlighting the existence of very natural optimization schemes of different order. We prove global convergence rates for our methods under the most general conditions. Assuming that the upper-level component of the objective function is subhomogeneous, we develop efficient modifications of the basic Fully Composite first-order and second-order methods, and propose their accelerated variants.
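To make the composite-minimization special case mentioned above concrete, the sketch below applies a standard proximal-gradient (ISTA) iteration to a problem of the form min_x f(x) + ψ(x), with smooth f(x) = ½‖Ax − b‖² and a simple nondifferentiable component ψ(x) = λ‖x‖₁. This is a generic illustration of the problem class, not the paper's own scheme; the function name and parameters are chosen for the example.

```python
import numpy as np

def ista(A, b, lam, L, iters=500):
    """Proximal-gradient (ISTA) sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    L should be an upper bound on the Lipschitz constant of the gradient
    of the smooth part, e.g. the largest eigenvalue of A^T A.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)       # gradient of the smooth component
        z = x - grad / L               # forward (gradient) step
        # backward step: soft-thresholding is the prox of lam*||.||_1
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x
```

Each iteration handles the nondifferentiable component exactly through its proximal operator (here, soft-thresholding), while the smooth component is processed by an ordinary gradient step, which is the structural idea behind treating such composite objectives in a unified way.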
