A classification approach to efficient global optimization in presence of non-computable domains

Gaussian-process-based optimization methods have become very popular in recent years for the global optimization of complex systems with high computational costs. These methods rely on the sequential construction of a statistical surrogate model, using a training set of computed objective function values, which is refined according to a prescribed infilling strategy. However, this sequential optimization procedure can stop prematurely if the objective function cannot be computed at a proposed point. Such a situation can occur when the search space encompasses design points corresponding to an unphysical configuration, an ill-posed problem, or a problem that is non-computable due to the limitations of numerical solvers. To avoid such a premature stop of the optimization procedure, we propose to use a classification model to learn the non-computable areas and to adapt the infilling strategy accordingly. Specifically, the proposed method splits the training set into two subsets composed of computable and non-computable points. A surrogate model for the objective function is built using the training set of computable points only, whereas a probabilistic classification model is built using the union of the computable and non-computable training sets. The classifier is then incorporated into the surrogate-based optimization procedure to avoid proposing new points in the non-computable domain while improving the classification uncertainty where needed. The method has the advantage of automatically adapting both the surrogate of the objective function and the classifier during the iterative optimization process, so the non-computable areas do not need to be known a priori. The proposed method is applied to several analytical problems presenting different types of difficulty, and to the optimization of a fully nonlinear fluid-structure interaction system, namely the drag minimization of a flexible hydrofoil with cavitation constraints. The efficiency of the proposed method compares favorably with that of a reference evolutionary algorithm, except in situations where the feasible domain is only a small portion of the design space.
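The infill strategy outlined above, in which the surrogate's expected improvement is combined with the predicted probability that a candidate point is computable, can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: it assumes a scikit-learn Gaussian-process regressor as the surrogate, a Platt-scaled SVM as the probabilistic classifier, a hypothetical toy_objective that returns None in a synthetic non-computable region, and a simple product of expected improvement and probability of computability as the acquisition function.

```python
# Minimal sketch (not the authors' exact merit function) of a classifier-aware
# infill step: a GP surrogate fitted on computable points only, a probabilistic
# classifier fitted on all evaluated points, and an acquisition that weights
# expected improvement by the probability of computability.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.svm import SVC

def toy_objective(x):
    """Hypothetical objective: returns None in a 'non-computable' region."""
    if x[0] > 0.8:
        return None                      # simulated solver failure
    return float(np.sum((x - 0.3) ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 2))                 # initial design
y = np.array([toy_objective(x) for x in X], dtype=object)

computable = np.array([v is not None for v in y])        # split the training set
y_ok = np.array(y[computable], dtype=float)

# Surrogate of the objective, built on computable points only.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X[computable], y_ok)

# Probabilistic classifier built on all evaluated points (Platt scaling).
clf = SVC(probability=True).fit(X, computable)

def acquisition(x):
    """Expected improvement weighted by the predicted probability of computability."""
    x = np.atleast_2d(x)
    mu, sigma = gp.predict(x, return_std=True)
    f_best = y_ok.min()
    z = (f_best - mu) / np.maximum(sigma, 1e-12)
    ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    p_ok = clf.predict_proba(x)[:, 1]                    # P(point is computable)
    return ei * p_ok

# Next infill point: maximize the weighted acquisition over a candidate set.
candidates = rng.uniform(0.0, 1.0, size=(2000, 2))
x_next = candidates[np.argmax(acquisition(candidates))]
```

In the method described in the paper, the classifier also drives additional sampling where its own uncertainty is high; the simple product form shown here only illustrates how candidate points predicted to be non-computable are penalized during the search.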
