Optimization combining derivative-free global exploration with derivative-based local refinement

This paper proposes a hybrid optimization scheme that combines an efficient (and, under appropriate assumptions, provably globally convergent) derivative-free optimization algorithm, dubbed Δ-DOGS, for globally exploring expensive nonconvex functions, with a new derivative-based local optimization algorithm that maximally accelerates local convergence from promising feasible points discovered in parameter space. The resulting hybrid scheme proceeds iteratively, automatically shifting between (derivative-free) global exploration and (derivative-based) local refinement as appropriate. The new derivative-based local refinement method uses, at each iteration, the Voronoi partitioning of all existing datapoints to establish a “modified trust-region” around the current best point, within which the derivative-based optimization step is performed. The resulting algorithm is analyzed, and its global convergence is proven under certain assumptions on the objective function. Finally, the algorithm is applied to nonconvex optimization problems with multiple local minima, and its computational cost is compared with that of the original Δ-DOGS algorithm.

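To make the structure of the hybrid iteration concrete, the Python sketch below alternates a surrogate-based global exploration step with a gradient-based refinement step confined to the Voronoi cell of the current best point. This is only an illustration under simplifying assumptions, not the authors' implementation: a radial-basis-function surrogate with a distance-to-data uncertainty proxy stands in for the Delaunay-based search function of Δ-DOGS, SLSQP with finite-difference gradients stands in for the derivative-based local method, and the two phases simply alternate rather than switching adaptively. The Voronoi cell of the best point x_b is described by the linear inequalities (x_j − x_b)·x ≤ (‖x_j‖² − ‖x_b‖²)/2 over all other datapoints x_j, which is what plays the role of the “modified trust-region”.

```python
# Hypothetical sketch of the hybrid loop (not the authors' implementation).
# Assumed stand-ins: an RBF surrogate plus a distance-to-data uncertainty proxy
# replaces the Delaunay-based search function of Delta-DOGS, and SLSQP with
# finite-difference gradients replaces the derivative-based local method.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def objective(x):
    # Expensive nonconvex test function (Rastrigin), standing in for the true objective.
    x = np.atleast_1d(x)
    return 10 * x.size + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def global_exploration(X, f, bounds, K=1.0, n_candidates=2000):
    # Pick the candidate minimizing surrogate(x) - K * (distance to nearest datapoint),
    # a crude proxy for the interpolant-minus-uncertainty search of Delta-DOGS.
    surrogate = RBFInterpolator(X, f, kernel='thin_plate_spline')
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_candidates, X.shape[1]))
    dist = np.min(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2), axis=1)
    return cand[np.argmin(surrogate(cand) - K * dist)]

def voronoi_constraints(X, i_best):
    # Linear inequalities (x_j - x_b).x <= (|x_j|^2 - |x_b|^2)/2 describing the
    # Voronoi cell of the current best point, used as the "modified trust-region".
    xb = X[i_best]
    cons = []
    for j, xj in enumerate(X):
        if j == i_best:
            continue
        a, b = xj - xb, 0.5 * (xj @ xj - xb @ xb)
        cons.append({'type': 'ineq', 'fun': lambda x, a=a, b=b: b - a @ x})
    return cons

def local_refinement(X, bounds, i_best):
    # Derivative-based step (finite-difference SLSQP) restricted to the Voronoi cell.
    res = minimize(objective, X[i_best], method='SLSQP',
                   bounds=list(map(tuple, bounds)),
                   constraints=voronoi_constraints(X, i_best))
    return res.x, res.fun

# Hybrid loop: here exploration and refinement simply alternate; the paper's scheme
# shifts between the two phases adaptively.
bounds = np.array([[-5.0, 5.0], [-5.0, 5.0]])
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(6, 2))
f = np.array([objective(x) for x in X])
for _ in range(10):
    x_new = global_exploration(X, f, bounds)                       # global exploration
    X, f = np.vstack([X, x_new]), np.append(f, objective(x_new))
    x_loc, f_loc = local_refinement(X, bounds, int(np.argmin(f)))  # local refinement
    if np.min(np.linalg.norm(X - x_loc, axis=1)) > 1e-8:           # skip duplicate points
        X, f = np.vstack([X, x_loc]), np.append(f, f_loc)
print("best point:", X[np.argmin(f)], "value:", f.min())
```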