A Memetic Algorithm Assisted by an Adaptive Topology RBF Network and Variable Local Models for Expensive Optimization Problems

A common practice in modern engineering is simulation-driven optimization: costly and lengthy laboratory experiments are replaced with computer experiments, i.e. computationally intensive simulations which model real-world physics with high fidelity. Owing to the complexity of such simulations, a single run can require up to several hours of CPU time on a high-performance computer [45, 56, 61]. With computer experiments, the simulation-driven optimization process is cast as a nonlinear optimization problem with three distinct features: (i) there is typically no analytic expression for the relation between inputs (candidate designs) and outputs, i.e. the objective is a black-box function; (ii) each simulation run is expensive, so only a small number (∼200) of runs can be made; and (iii) the underlying real-world physics and/or numerical solution often yield an input–output landscape which is multimodal and nonsmooth.

A promising approach to tackling such problems is surrogate-assisted memetic optimization. A memetic algorithm combines an evolutionary algorithm (EA) with an efficient local search so as to obtain both efficient exploration and exploitation during the optimization search [21, 65]. A surrogate-model is a computationally cheaper mathematical approximation of the expensive objective function and is used during the optimization search in lieu of the expensive function [2, 45] (in some references the term metamodel is used synonymously, while ‘surrogate-model’ is reserved for a lower-fidelity simulation [42, 87]). Using surrogate-models thus circumvents the problem of simulation cost and allows many candidate designs to be evaluated.

In this study we propose a surrogate-assisted memetic algorithm which builds upon recent advances in computational intelligence and optimization [9, 53, 60, 83–85, 94]. The proposed algorithm aims to address four open issues. The first is that obtaining a global model with a small generalization error is too expensive: analysis has shown that the number of sites required to achieve a fixed generalization error grows exponentially with the problem dimension [79]. To avoid allocating all function evaluations to the global model, we employ a combination of global and local surrogate-models to achieve an efficient optimization search.
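As a rough, hedged illustration of the approach described above (and not the algorithm proposed in this chapter), the following minimal Python sketch couples an interpolating Gaussian RBF surrogate with a simple evolutionary loop and a crude random-perturbation local refinement. The objective expensive_f is a cheap analytic stand-in (Rastrigin) for an expensive simulation, and every name and parameter value (budget, pop_size, sigma, the mutation scales) is an illustrative assumption.

```python
# Minimal sketch only: an RBF surrogate inside a memetic-style loop under a
# strict evaluation budget. This is not the adaptive-topology RBF network or
# the variable local models of the proposed algorithm; all settings are assumed.
import numpy as np

def expensive_f(x):
    """Cheap stand-in for an expensive black-box simulation (Rastrigin)."""
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def fit_rbf(X, y, sigma=1.0):
    """Fit a Gaussian RBF interpolant to the evaluated sites (X, y)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    Phi = np.exp(-d2 / (2 * sigma**2))
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)  # mild regularization
    return lambda Z: np.exp(
        -np.sum((Z[:, None, :] - X[None, :, :])**2, axis=-1) / (2 * sigma**2)) @ w

rng = np.random.default_rng(0)
dim, budget, pop_size = 5, 100, 20
X = rng.uniform(-5, 5, size=(2 * dim, dim))           # small initial design
y = np.array([expensive_f(x) for x in X])

while len(y) < budget:
    surrogate = fit_rbf(X, y)
    # Exploration: offspring generated around the best evaluated sites.
    parents = X[np.argsort(y)[:pop_size]]
    offspring = parents[rng.integers(len(parents), size=pop_size)] \
        + rng.normal(scale=0.5, size=(pop_size, dim))
    # Exploitation: crude local refinement of the surrogate-best offspring,
    # performed entirely on the cheap surrogate.
    best = offspring[np.argmin(surrogate(offspring))]
    for _ in range(50):
        trial = best + rng.normal(scale=0.1, size=dim)
        if surrogate(trial[None])[0] < surrogate(best[None])[0]:
            best = trial
    # Only the single selected candidate costs a true (expensive) evaluation.
    X = np.vstack([X, best])
    y = np.append(y, expensive_f(best))

print("best value found within the budget:", y.min())
```

The sketch only conveys the interplay between a global surrogate, a local refinement step, and a strict budget of true evaluations; the proposed algorithm replaces these placeholders with an adaptive-topology RBF network as the global model and variable local models for the local search.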

[1] H. B. Mann et al., On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other, 1947.

[2] Richard Bellman et al., Adaptive Control Processes: A Guided Tour, The Mathematical Gazette, 1961.

[3] W. H. Highleyman et al., The design and analysis of pattern recognition experiments, 1962.

[4] G. Matheron, Principles of geostatistics, 1963.

[5] M. Stone, Cross-Validatory Choice and Assessment of Statistical Predictions, 1976.

[6] P. Toint, Some numerical results using a sparse matrix updating formula in unconstrained optimization, 1978.

[7] C. J. Stone et al., Optimal Global Rates of Convergence for Nonparametric Regression, 1982.

[8] R. Franke, Scattered data interpolation: tests of some methods, 1982.

[9] K. Mardia et al., Maximum likelihood estimation of models for residual covariance in spatial regression, 1984.

[10] Roger K. Moore, Computer Speech and Language, 1986.

[11] C. Micchelli, Interpolation of scattered data: Distance matrices and conditionally positive definite functions, 1986.

[12] George E. P. Box et al., Empirical Model-Building and Response Surfaces, 1988.

[13] John Moody et al., Fast Learning in Networks of Locally-Tuned Processing Units, Neural Computation, 1989.

[14] Aimo A. Törn et al., Global Optimization, Science, 1999.

[15] F. Girosi et al., Networks for approximation and learning, Proc. IEEE, 1990.

[16] G. Box et al., Empirical Model-Building and Response Surfaces, 1990.

[17] James D. Keeler et al., Layered Neural Networks with Gaussian Hidden Units as Universal Approximations, Neural Computation, 1990.

[18] Tomaso A. Poggio et al., Extensions of a Theory of Networks for Approximation and Learning, NIPS, 1990.

[19] Mahesan Niranjan et al., Neural networks and radial basis functions in classifying static speech patterns, 1990.

[20] M. E. Johnson et al., Minimax and maximin distance designs, 1990.

[21] Jooyoung Park et al., Universal Approximation Using Radial-Basis-Function Networks, Neural Computation, 1991.

[22] Shang-Liang Chen et al., Orthogonal least squares learning algorithm for radial basis function networks, IEEE Trans. Neural Networks, 1991.

[23] John C. Platt, A Resource-Allocating Network for Function Interpolation, Neural Computation, 1991.

[24] Pedro S. de Souza et al., Genetic Algorithms in Asynchronous Teams, ICGA, 1991.

[25] Sukhan Lee et al., A Gaussian potential function network with hierarchically self-organizing learning, Neural Networks, 1991.

[26] Heinz Mühlenbein et al., Predictive Models for the Breeder Genetic Algorithm I. Continuous Parameter Optimization, Evolutionary Computation, 1993.

[27] J.-F. M. Barthelemy et al., Approximation concepts for optimum structural design: a review, 1993.

[28] Jean-Michel Renders et al., Hybridizing genetic algorithms with hill-climbing methods for global optimization: two possible ways, Proceedings of the First IEEE Conference on Evolutionary Computation (IEEE World Congress on Computational Intelligence), 1994.

[29] Adrian G. Bors et al., Minimal Topology for a Radial Basis Functions Neural Network for Pattern Classification, 1994.

[30] Russell R. Barton et al., Metamodeling: a state of the art review, Proceedings of the Winter Simulation Conference, 1994.

[31] Douglas C. Montgomery et al., Response Surface Methodology: Process and Product Optimization Using Designed Experiments, 1995.

[32] Robert E. Smith et al., Fitness inheritance in genetic algorithms, SAC '95, 1995.

[33] R. Schaback, Multivariate Interpolation and Approximation by Translates of a Basis Function, 1995.

[34] H. N. Mhaskar et al., Neural Networks for Optimal Approximation of Smooth and Analytic Functions, Neural Computation, 1996.

[35] Jean-Michel Renders et al., Hybrid methods using genetic algorithms for global optimization, IEEE Trans. Syst. Man Cybern. Part B, 1996.

[36] Raphael T. Haftka et al., Multidisciplinary aerospace design optimization: survey of recent developments, 1996.

[37] Robert M. Lewitt et al., Practical considerations for 3-D image reconstruction using spherically symmetric volume elements, IEEE Trans. Medical Imaging, 1996.

[38] William E. Hart et al., Optimization with genetic algorithm hybrids that use local searches, 1996.

[39] Natalia Alexandrov et al., Multidisciplinary design optimization: state of the art, 1997.

[40] F. Jose et al., Convergence of Trust Region Augmented Lagrangian Methods Using Variable Fidelity Approximation Data, 1997.

[41] Nicolaos B. Karayiannis et al., Growing radial basis neural networks: merging supervised and unsupervised learning with network growth techniques, IEEE Trans. Neural Networks, 1997.

[42] Katya Scheinberg et al., Recent progress in unconstrained nonlinear optimization without derivatives, Math. Program., 1997.

[43] N. M. Alexandrov et al., A trust-region framework for managing the use of approximation models in optimization, 1997.

[44] Virginia Torczon et al., Using approximations to accelerate engineering design optimization, 1998.

[45] Kazuhiro Nakahashi et al., Design Optimization of Supersonic Wings Using Evolutionary Algorithms, 1998.

[46] John Yen et al., A hybrid approach to modeling metabolic systems using a genetic algorithm and simplex method, IEEE Trans. Syst. Man Cybern. Part B, 1998.

[47] Alain Ratle et al., Accelerating the Convergence of Evolutionary Algorithms by Fitness Landscape Approximation, PPSN, 1998.

[48] Erik D. Goodman et al., Evaluation of Injection Island GA Performance on Flywheel Design Optimisation, 1998.

[49] H. Sobieczky, Parametric Airfoils and Wings, 1999.

[50] Aimo A. Törn et al., Stochastic Global Optimization: Problem Classes and Solution Techniques, J. Glob. Optim., 1999.

[51] A. Ratle, Optimal sampling strategies for learning a fitness model, Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), 1999.

[52] Shmuel Rippa et al., An algorithm for selecting a good value for the parameter c in radial basis function interpolation, Adv. Comput. Math., 1999.

[53] Mourad Sefrioui et al., A Hierarchical Genetic Algorithm Using Multiple Models for Optimization, PPSN, 2000.

[54] A. Oyama et al., Real-Coded Adaptive Range Genetic Algorithm and Its Application to Aerodynamic Design, 2000.

[55] T. Simpson et al., Comparative studies of metamodeling techniques under multiple modeling criteria, 2000.

[56] X. Yao, Evolutionary Search of Approximated N-dimensional Landscapes, 2000.

[57] C. Poloni et al., Hybridization of a multi-objective genetic algorithm, a neural network and a classical optimizer for a complex design problem in fluid dynamics, 2000.

[58] Richard J. Beckman et al., A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output From a Computer Code, Technometrics, 2000.

[59] E. Kansa et al., Circumventing the ill-conditioning problem with multiquadric radial basis functions: Applications to elliptic partial differential equations, 2000.

[60] Timothy W. Simpson et al., Metamodels for Computer-based Engineering Design: Survey and recommendations, Engineering with Computers, 2001.

[61] Akira Oyama et al., Real-coded adaptive range genetic algorithm applied to transonic wing optimization, Appl. Soft Comput., 2000.

[62] Thomas H. Pulliam et al., Aerodynamic Shape Optimization Using a Real-Number-Encoded Genetic Algorithm, AIAA Paper 2001-2473, 2001.

[63] Timothy W. Simpson et al., Sampling Strategies for Computer Experiments: Design and Analysis, 2001.

[64] Sung-Bae Cho et al., An efficient genetic algorithm with less fitness evaluation by clustering, Proceedings of the 2001 Congress on Evolutionary Computation, 2001.

[65] T. Simpson et al., Computationally Inexpensive Metamodel Assessment Strategies, 2002.

[66] Kyriakos C. Giannakoglou et al., Design of optimal aerodynamic shapes using stochastic optimization methods and computational intelligence, 2002.

[67] Søren Nymand Lophaven et al., DACE - A Matlab Kriging Toolbox, 2002.

[68] Thomas J. Santner et al., The Design and Analysis of Computer Experiments, Springer Series in Statistics, 2003.

[69] Mitsuo Gen et al., Various hybrid methods based on genetic algorithm with fuzzy logic controller, J. Intell. Manuf., 2003.

[70] A. Keane et al., Evolutionary Optimization of Computationally Expensive Problems via Surrogate Modeling, 2003.

[71] Marios K. Karakasis et al., On the Use of Surrogate Evaluation Models in Multi-Objective Evolutionary Algorithms, 2004.

[72] Andy J. Keane et al., Meta-Lamarckian learning in memetic algorithms, IEEE Transactions on Evolutionary Computation, 2004.

[73] M. Carbonaro et al., von Karman Institute for Fluid Dynamics, 2004.

[74] Ren-Jye Yang et al., Approximation methods in multidisciplinary analysis and optimization: a panel discussion, 2004.

[75] P. Rousseeuw et al., Wiley Series in Probability and Mathematical Statistics, 2005.

[76] T. Simpson et al., Use of Kriging Models to Approximate Deterministic Computer Models, 2005.

[77] Marios K. Karakasis et al., Metamodel-Assisted Multi-Objective Evolutionary Optimization, 2005.

[78] Yaochu Jin et al., A comprehensive survey of fitness approximation in evolutionary computation, Soft Comput., 2005.

[79] António Gaspar-Cunha et al., A Multi-Objective Evolutionary Algorithm Using Neural Networks to Approximate Fitness Evaluations, Int. J. Comput. Syst. Signals, 2005.

[80] Antonio Filippone et al., Flight Performance of Fixed- and Rotary-Wing Aircraft, 2006.

[81] Kai-Yew Lum et al., Max-min surrogate-assisted evolutionary algorithm for robust design, IEEE Transactions on Evolutionary Computation, 2006.

[82] Bernd Fritzke et al., Fast learning with incremental RBF networks, Neural Processing Letters, 1994.

[83] Bu-Sung Lee et al., Memetic algorithm using multi-surrogates for computationally expensive optimization problems, Soft Comput., 2007.

[84] J. Dennis et al., Managing Approximation Models in Optimization, 2007.

[85] Andy J. Keane et al., Combining Global and Local Surrogate Models to Accelerate Evolutionary Optimization, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2007.

[86] Yoel Tenne et al., A Memetic Algorithm Using a Trust-Region Derivative-Free Optimization with Quadratic Modelling for Optimization of Expensive and Noisy Black-box Functions, Evolutionary Computation in Dynamic and Uncertain Environments, 2007.

[87] Yoel Tenne et al., A Versatile Surrogate-Assisted Memetic Algorithm for Optimization of Computationally Expensive Functions and its Engineering Applications, 2008.

[88] Yoel Tenne et al., Metamodel accuracy assessment in evolutionary optimization, 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), 2008.

[89] Sonja Kuhnt et al., Design and analysis of computer experiments, 2010.

[90] S. De Marchi et al., On Optimal Center Locations for Radial Basis Function Interpolation: Computational Aspects, 2022.