Smoothness Parameter of Power of Euclidean Norm

In this paper, we study the derivatives of powers of the Euclidean norm. We prove their Hölder continuity and establish explicit expressions for the corresponding constants. We show that these constants are optimal for odd derivatives and suboptimal by at most a factor of two for the even ones. In the particular case of integer powers, when Hölder continuity reduces to Lipschitz continuity, we improve this result and obtain the optimal constants.
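As a sketch of the setting (the notation below is ours for illustration, not taken verbatim from the paper): consider the power of the Euclidean norm

\[
f_p(x) = \|x\|^p, \qquad x \in \mathbb{R}^n, \quad p > 0 .
\]

The $k$-th derivative $\nabla^k f_p$ is called Hölder continuous with exponent $\nu \in (0,1]$ and constant $H_\nu$ if

\[
\|\nabla^k f_p(x) - \nabla^k f_p(y)\| \;\le\; H_\nu \, \|x - y\|^{\nu}
\qquad \text{for all } x, y \in \mathbb{R}^n,
\]

where the norm on the left is the operator norm induced by the Euclidean norm. Since $\nabla^k f_p$ is positively homogeneous of degree $p - k$, the natural exponent is $\nu = p - k$ whenever $k < p \le k + 1$; for an integer power $p = k + 1$ this is exactly the Lipschitz case $\nu = 1$. For example, $f_2(x) = \|x\|^2$ gives $\nabla f_2(x) = 2x$, which is Lipschitz continuous with the optimal constant $H_1 = 2$.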
