The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case

Intrinsic noise in evaluations of the objective function and its derivatives may cause premature termination of optimization algorithms. Evaluation-complexity bounds that account for this situation are presented in the framework of a deterministic trust-region method. The results show that the presence of intrinsic noise may dominate these bounds, in contrast with what is known for methods in which the inexactness of function and derivative evaluations is fully controllable. Moreover, the new analysis provides estimates of the optimality level achievable should noise cause early termination. Finally, it sheds some light on the impact of inexact computer arithmetic on evaluation complexity.
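The mechanism behind noise-induced early termination can be sketched with a toy example: once the model's predicted decrease falls below the level of the noise contaminating the function values, the trust-region ratio test is no longer informative and the iteration must stop, leaving a possibly nonzero residual gradient. The following is a minimal illustrative sketch, not the paper's algorithm; the 1-D quartic test problem, the deterministic synthetic noise model, and the `2 * eps_f` stopping threshold are all assumptions chosen for illustration.

```python
import math

def trust_region_noisy(f, g, h, x, eps_f, max_iter=200):
    """Basic 1-D trust-region iteration in which every function
    evaluation is contaminated by bounded noise of size eps_f.
    Illustrative sketch only."""
    k_eval = 0

    def f_noisy(x):
        # deterministic bounded "noise" standing in for intrinsic
        # evaluation error of magnitude at most eps_f
        nonlocal k_eval
        k_eval += 1
        return f(x) + eps_f * math.sin(12345.6789 * k_eval)

    delta = 1.0  # trust-region radius
    for _ in range(max_iter):
        gk, hk = g(x), h(x)
        if abs(gk) < 1e-12:
            return x, gk, "first-order point"
        # model minimizer, clipped to the trust region
        s = -gk / hk if hk > 0 else -math.copysign(delta, gk)
        s = max(-delta, min(delta, s))
        pred = -(gk * s + 0.5 * hk * s * s)  # predicted decrease
        if pred <= 2.0 * eps_f:
            # the predicted decrease is drowned by the evaluation
            # noise: the ratio test below would be meaningless, so
            # the method terminates early with a nonzero gradient
            return x, gk, "noise-dominated"
        rho = (f_noisy(x) - f_noisy(x + s)) / pred
        if rho >= 0.1:
            x += s
            delta = min(2.0 * delta, 1e2)
        else:
            delta *= 0.5
    return x, g(x), "max iterations"

# quartic test problem: minimizer at 0, Hessian degenerates there,
# so progress slows and the noise floor is eventually reached
x, grad, status = trust_region_noisy(
    lambda x: x**4, lambda x: 4 * x**3, lambda x: 12 * x**2,
    x=1.0, eps_f=1e-8)
print(status, abs(grad))
```

On this problem the loop makes steady progress until the predicted decrease shrinks to the order of `eps_f`, at which point it stops with a residual gradient well above machine precision: the achievable optimality level is limited by the noise, echoing the phenomenon the analysis quantifies.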
