Blackbox Optimization in Engineering Design: Adaptive Statistical Surrogates and Direct Search Algorithms

Simulation-based design optimization relies on computational models to evaluate objective and constraint functions. Typical challenges in solving such problems include unavailable gradients or unreliable approximations thereof, excessive computational cost, numerical noise, multi-modality, and even the models' failure to return a value. It has become common to use the term "blackbox" for a computational model that exhibits any of these characteristics and/or is inaccessible to the design engineer (i.e., cannot be modified directly to address these issues). A possible remedy for dealing with blackboxes is surrogate-based derivative-free optimization, but it must be applied carefully, with appropriate formulations and algorithms. In this work, we use the R package dynaTree to build statistical surrogates of the blackboxes, combined with a direct search method for derivative-free optimization. We present different formulations for the surrogate problem considered at each search step of the Mesh Adaptive Direct Search (MADS) algorithm within a surrogate management framework. The proposed formulations are tested on two simulation-based multidisciplinary design optimization problems. Numerical results confirm that the use of statistical surrogates in MADS improves the efficiency of the optimization algorithm.