The global rate of convergence for optimal tensor methods in smooth convex optimization

We consider convex optimization problems with an objective function having a Lipschitz-continuous $p$-th order derivative, where $p\geq 1$. We propose a new tensor method, which closes the gap between the lower $O\left(\varepsilon^{-\frac{2}{3p+1}} \right)$ and upper $O\left(\varepsilon^{-\frac{1}{p+1}} \right)$ iteration complexity bounds for this class of optimization problems. We also consider uniformly convex functions and show how the proposed method can be accelerated under this additional assumption. Moreover, we introduce a $p$-th order condition number, which naturally arises in the complexity analysis of tensor methods under this assumption. Finally, we present a numerical study of the proposed optimal method and show that in practice it is faster than the best known accelerated tensor method. We also compare the performance of tensor methods for $p=2$ and $p=3$ and show that the 3rd-order method is superior to the 2nd-order method in practice.
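The gap closed by the proposed method can be made concrete by comparing the two exponents above, ignoring all constants and problem-dependent factors. The sketch below is purely illustrative: it evaluates the $\varepsilon$-dependence of the lower bound $\varepsilon^{-2/(3p+1)}$ (attained by the optimal method) against the earlier upper bound $\varepsilon^{-1/(p+1)}$ for a few values of $p$; the function names and the choice of $\varepsilon$ are ours, not part of the paper.

```python
# Illustrative comparison of the iteration-complexity exponents from the
# abstract, up to constants: the lower bound O(eps^(-2/(3p+1))), matched by
# the proposed optimal tensor method, vs. the earlier upper bound
# O(eps^(-1/(p+1))). Only the eps-dependence is compared.

def optimal_iters(eps: float, p: int) -> float:
    """Lower-bound rate eps^(-2/(3p+1)), constants ignored."""
    return eps ** (-2.0 / (3 * p + 1))

def previous_iters(eps: float, p: int) -> float:
    """Earlier upper-bound rate eps^(-1/(p+1)), constants ignored."""
    return eps ** (-1.0 / (p + 1))

if __name__ == "__main__":
    eps = 1e-8  # target accuracy, chosen for illustration
    for p in (1, 2, 3):
        print(f"p={p}: optimal ~ {optimal_iters(eps, p):.1f} iterations, "
              f"previous ~ {previous_iters(eps, p):.1f} iterations")
```

Note that for $p=1$ the two exponents coincide ($2/4 = 1/2$), recovering the classical first-order rate, while for $p\geq 2$ the optimal exponent is strictly smaller, so the gap between the bounds widens as $p$ grows.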
