Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers

We present two software packages for derivative-free optimization (DFO): DFO-LS for nonlinear least-squares problems and Py-BOBYQA for general objectives, both with optional bound constraints. Inspired by the Gauss–Newton method, DFO-LS constructs simplified linear regression models for the residuals and allows flexible initialization for expensive problems, so that it can begin making progress after as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for a single gradient evaluation. DFO-LS also has improved robustness to noise, offering sample averaging, regression-based model construction, and multiple restart strategies together with an auto-detection mechanism. Our extensive numerical experiments show that restarting the solver when stagnation is detected is a cheap and effective way to achieve robustness, and that it outperforms sampling and regression techniques. The package Py-BOBYQA is a Python implementation of BOBYQA (Powell 2009) with additional features, most notably these robustness-to-noise strategies. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers on noisy problems. In our comparisons we introduce an adaptive accuracy measure for data profiles of noisy functions, which balances measuring improvement in the true objective against improvement in the noisy objective.
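
As a rough illustration (not taken from the paper itself), the sketch below shows how the two solvers might be called on a noisy, bound-constrained variant of the Rosenbrock problem. The package names dfols and pybobyqa, the solve() entry point, and the bounds, maxfun and objfun_has_noise options follow the packages' public documentation as we understand it and should be checked against the installed version; the residual function, noise level and bounds are invented purely for illustration.

```python
# Minimal usage sketch (assumed package/option names; verify against the
# DFO-LS and Py-BOBYQA documentation for the installed version).
import numpy as np
import dfols
import pybobyqa

rng = np.random.default_rng(0)

def residuals(x):
    # Rosenbrock in least-squares form r(x) = [10(x2 - x1^2), 1 - x1],
    # with small multiplicative Gaussian noise to mimic a stochastic objective.
    r = np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
    return r * (1.0 + 1e-2 * rng.standard_normal(r.shape))

x0 = np.array([-1.2, 1.0])
bounds = (np.array([-5.0, -5.0]), np.array([5.0, 5.0]))  # (lower, upper)

# DFO-LS: the objective is supplied as its vector of residuals.
soln_ls = dfols.solve(residuals, x0, bounds=bounds, maxfun=100,
                      objfun_has_noise=True)
print(soln_ls.x, soln_ls.f)

# Py-BOBYQA: a general scalar objective, here the sum of squared residuals.
soln_gen = pybobyqa.solve(lambda x: float(np.sum(residuals(x) ** 2)), x0,
                          bounds=bounds, maxfun=100, objfun_has_noise=True)
print(soln_gen.x, soln_gen.f)
```

Setting objfun_has_noise=True is intended to switch the solvers to their noise-robust defaults (restarts and, depending on settings, sample averaging or regression models); for a deterministic objective it can simply be omitted.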

[1] Nicholas I. M. Gould, et al. CUTEst: a Constrained and Unconstrained Testing Environment with safe threads for mathematical optimization, 2013, Computational Optimization and Applications.

[2] Coralia Cartis, et al. A derivative-free Gauss–Newton method, 2017, Mathematical Programming Computation.

[3] Y. Marzouk, et al. NOWPAC: A provably convergent nonlinear optimizer with path-augmented constraints for noisy regimes, 2014.

[4] Katya Scheinberg, et al. Introduction to derivative-free optimization, 2010, Math. Comput.

[5] Warren Hare, et al. Methods to compare expensive stochastic optimization algorithms with random restarts, 2018, J. Glob. Optim.

[6] Luís N. Vicente, et al. Direct Multisearch for Multiobjective Optimization, 2011, SIAM J. Optim.

[7] Katya Scheinberg, et al. Methodologies and software for derivative-free optimization, 2017.

[8] Katya Scheinberg, et al. Stochastic optimization using a trust-region method and random models, 2015, Mathematical Programming.

[9] Michael C. Ferris, et al. Adaptation of the UOBYQA Algorithm for Noisy Functions, 2006, Proceedings of the 2006 Winter Simulation Conference.

[10] Jorge J. Moré, et al. Testing Unconstrained Optimization Software, 1981, TOMS.

[11] M. J. D. Powell. The BOBYQA algorithm for bound constrained optimization without derivatives, 2009.

[12] Stefan M. Wild. Chapter 40: POUNDERS in TAO: Solving Derivative-Free Nonlinear Least-Squares Problems with POUNDERS, 2017.

[13] Rommel G. Regis, et al. The calculus of simplex gradients, 2015, Optim. Lett.

[14] C. T. Kelley, et al. Detection and Remediation of Stagnation in the Nelder–Mead Algorithm Using a Sufficient Decrease Condition, 1999, SIAM J. Optim.

[15] Y. Sergeyev, et al. Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms, 2017, Math. Comput. Simul.

[16] M. J. D. Powell, et al. Least Frobenius norm updating of quadratic models that satisfy interpolation conditions, 2004, Math. Program.

[17] Jeffrey Larson, et al. Derivative-Free Optimization of Expensive Functions with Computational Error Using Weighted Regression, 2013, SIAM J. Optim.

[18] Katya Scheinberg, et al. Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization, 2010, SIAM J. Optim.

[19] Philippe L. Toint, et al. BFO, A Trainable Derivative-free Brute Force Optimizer for Nonlinear Bound-constrained Optimization and Equilibrium Computations with Continuous and Discrete Variables, 2017, ACM Trans. Math. Softw.

[20] Y. Marzouk, et al. A trust-region method for derivative-free nonlinear constrained stochastic optimization, 2017, arXiv:1703.04156.

[21] Jorge J. Moré, et al. Benchmarking optimization software with performance profiles, 2001, Math. Program.

[22] James Demmel. Applied Numerical Linear Algebra, 1997.

[23] Charles Audet, et al. Derivative-Free and Blackbox Optimization, 2017.

[24] Shabbir Ahmed, et al. Advances and Trends in Optimization with Engineering Applications, 2017.

[25] Warren Hare, et al. Best practices for comparing optimization algorithms, 2017, Optimization and Engineering.

[26] Raghu Pasupathy, et al. ASTRO-DF: A Class of Adaptive Sampling Trust-Region Algorithms for Derivative-Free Stochastic Optimization, 2016, SIAM J. Optim.

[27] Stefan M. Wild, et al. Benchmarking Derivative-Free Optimization Algorithms, 2009, SIAM J. Optim.

[28] Michael C. Ferris, et al. Variable-Number Sample-Path Optimization, 2008, Math. Program.

[29] Stefan M. Wild, et al. Non-intrusive termination of noisy optimization, 2013, Optim. Methods Softw.

[30] M. J. D. Powell, et al. On trust region methods for unconstrained minimization without derivatives, 2003, Math. Program.

[31] Ya-Xiang Yuan, et al. A derivative-free trust-region algorithm for composite nonsmooth optimization, 2014, Computational and Applied Mathematics.

[32] Coralia Cartis, et al. Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers, 2019.

[33] Katya Scheinberg, et al. On the local convergence of a derivative-free algorithm for least-squares minimization, 2010, Computational Optimization and Applications.

[34] Nicholas I. M. Gould, et al. Trust Region Methods, 2000, MOS-SIAM Series on Optimization.

[35] L. N. Vicente, et al. Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation, 2008.

[36] Fabio Schoen, et al. Global Optimization: Theory, Algorithms, and Applications, 2013.

[37] D. K. Smith, et al. Numerical Optimization, 2001, J. Oper. Res. Soc.