Memetic algorithm using multi-surrogates for computationally expensive optimization problems

In this paper, we present a multi-surrogate-assisted memetic algorithm for solving optimization problems with computationally expensive fitness functions. The backbone of our framework is an evolutionary algorithm coupled with a local search solver that employs multiple surrogates in the spirit of Lamarckian learning. Inspired by the notion of the 'curse and blessing of uncertainty' in approximation models, we combine regression and exact interpolating surrogate models in the evolutionary search. Empirical results on a series of commonly used benchmark problems demonstrate that the proposed framework converges to good solution quality more efficiently than the standard genetic algorithm, the standard memetic algorithm, and surrogate-assisted memetic algorithms.
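
The abstract summarizes the framework at a high level; below is a minimal, self-contained sketch of how such a multi-surrogate Lamarckian loop can be wired together. It is not the authors' implementation: the expensive objective (`expensive_fitness`, here a Rastrigin stand-in), the cubic RBF interpolant and quadratic least-squares regressor used as the two surrogates, the L-BFGS-B local solver, and all population and budget settings are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
DIM, POP, GENS, BUDGET = 5, 20, 15, 400          # problem size and exact-evaluation budget
LB, UB = -5.0, 5.0

def expensive_fitness(x):
    # Placeholder for the costly simulation; the Rastrigin function is a cheap stand-in.
    return float(np.sum(x**2 + 10.0 - 10.0*np.cos(2.0*np.pi*x)))

def rbf_fit(X, y):
    # Exact interpolating surrogate: cubic radial basis function through the archive points.
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    w = np.linalg.solve(r**3 + 1e-6*np.eye(len(X)), y)
    return lambda x: float(np.linalg.norm(x - X, axis=-1)**3 @ w)

def quad_fit(X, y):
    # Regression surrogate: separable second-order polynomial fitted by least squares.
    Z = np.hstack([np.ones((len(X), 1)), X, X**2])
    c, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return lambda x: float(np.hstack([1.0, x, x**2]) @ c)

pop = rng.uniform(LB, UB, (POP, DIM))
fit = np.array([expensive_fitness(p) for p in pop])
archive_X, archive_y, evals = list(pop), list(fit), POP

for gen in range(GENS):
    if evals >= BUDGET:
        break
    # GA step: binary tournament selection plus Gaussian perturbation (crossover omitted for brevity).
    a, b = rng.integers(0, POP, (2, POP))
    parents = pop[np.where(fit[a] <= fit[b], a, b)]
    children = np.clip(parents + 0.3*rng.normal(size=(POP, DIM)), LB, UB)

    # Build both surrogates from every exactly evaluated point seen so far.
    X, y = np.array(archive_X), np.array(archive_y)
    surrogates = [rbf_fit(X, y), quad_fit(X, y)]

    for i in range(POP):
        if evals + len(surrogates) > BUDGET:
            break
        # Local search on each surrogate, then exact re-evaluation of each surrogate optimum.
        trials = [minimize(s, children[i], method='L-BFGS-B',
                           bounds=[(LB, UB)] * DIM).x for s in surrogates]
        scores = [expensive_fitness(t) for t in trials]
        evals += len(trials)
        archive_X.extend(trials)
        archive_y.extend(scores)
        # Lamarckian learning: the locally refined genotype replaces its parent if it improves.
        j = int(np.argmin(scores))
        if scores[j] < fit[i]:
            pop[i], fit[i] = trials[j], scores[j]

    print(f"gen {gen:2d}  best exact fitness {fit.min():.4f}  evaluations {evals}")
```

The essential pattern in this sketch is that each offspring is refined on every surrogate, only the refined candidates are charged against the exact-evaluation budget, and an improved refinement overwrites the offspring's genotype, which is the Lamarckian element the abstract refers to.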

[1] Francisco Herrera, et al. Hybrid crossover operators for real-coded genetic algorithms: an experimental study, 2005, Soft Comput.

[2] Yew-Soon Ong, et al. A domain knowledge based search advisor for design problem solving environments, 2002.

[3] Fred H. Lesh, et al. Multi-dimensional least-squares polynomial curve fitting, 1959, CACM.

[4] Joshua D. Knowles, et al. ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, 2006, IEEE Transactions on Evolutionary Computation.

[5] Yew-Soon Ong, et al. Curse and Blessing of Uncertainty in Evolutionary Algorithm Using Approximation, 2006, IEEE International Conference on Evolutionary Computation.

[6] Kai-Yew Lum, et al. Max-min surrogate-assisted evolutionary algorithm for robust design, 2006, IEEE Transactions on Evolutionary Computation.

[7] Yew-Soon Ong, et al. Hybrid evolutionary algorithm with Hermite radial basis function interpolants for computationally expensive adjoint solvers, 2008, Comput. Optim. Appl.

[8] J. Mason, et al. Algorithms for approximation, 1987.

[9] Bernhard Sendhoff, et al. A framework for evolutionary optimization with approximate fitness functions, 2002, IEEE Trans. Evol. Comput.

[10] Alain Ratle, et al. Kriging as a surrogate fitness landscape in evolutionary optimization, 2001, Artificial Intelligence for Engineering Design, Analysis and Manufacturing.

[11] Bernhard Sendhoff, et al. Reducing Fitness Evaluations Using Clustering Techniques and Neural Network Ensembles, 2004, GECCO.

[12] A. Keane, et al. Evolutionary Optimization of Computationally Expensive Problems via Surrogate Modeling, 2003.

[13] Kok Wai Wong, et al. Surrogate-Assisted Evolutionary Optimization Frameworks for High-Fidelity Engineering Design Problems, 2005.

[14] Xuan Jiang. Constrained Multi-Objective GA Optimization Using Reduced Models, 2003.

[15] Bu-Sung Lee, et al. A Multi-cluster Grid Enabled Evolution Framework for Aerodynamic Airfoil Design Optimization, 2005, ICNC.

[16] Heekuck Oh, et al. Neural Networks for Pattern Recognition, 1993, Adv. Comput.

[17] Andy J. Keane, et al. Meta-Lamarckian learning in memetic algorithms, 2004, IEEE Transactions on Evolutionary Computation.

[18] Christine A. Shoemaker, et al. Local function approximation in evolutionary algorithms for the optimization of costly functions, 2004, IEEE Transactions on Evolutionary Computation.

[19] Dimitri N. Mavris, et al. New Approaches to Conceptual and Preliminary Aircraft Design: A Comparative Assessment of a Neural Network Formulation and a Response Surface Methodology, 1998.

[20] Andy J. Keane, et al. Metamodeling Techniques for Evolutionary Optimization of Computationally Expensive Problems: Promises and Limitations, 1999, GECCO.

[21] A. L. Edwards, et al. An introduction to linear regression and correlation, 1985.

[22] Andy J. Keane, et al. Combining Global and Local Surrogate Models to Accelerate Evolutionary Optimization, 2007, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).

[23] Kalyanmoy Deb, et al. Computationally effective search and optimization procedure using coarse to fine approximations, 2003, Congress on Evolutionary Computation (CEC '03).

[24] Petros Koumoutsakos, et al. Accelerating evolutionary algorithms with Gaussian process fitness function models, 2005, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).

[25] M. J. D. Powell, et al. Radial basis functions for multivariable interpolation: a review, 1987.

[26] Thomas Bäck, et al. Metamodel-Assisted Evolution Strategies, 2002, PPSN.

[27] Craig T. Lawrence, et al. A Computationally Efficient Feasible Sequential Quadratic Programming Algorithm, 2000, SIAM J. Optim.

[28] Yaochu Jin, et al. A comprehensive survey of fitness approximation in evolutionary computation, 2005, Soft Comput.

[29] Andreas Zell, et al. Evolution strategies assisted by Gaussian processes with improved preselection criterion, 2003, Congress on Evolutionary Computation (CEC '03).

[30] Kyriakos C. Giannakoglou, et al. Design of optimal aerodynamic shapes using stochastic optimization methods and computational intelligence, 2002.

[31] T. W. Layne, et al. A Comparison of Approximation Modeling Techniques: Polynomial Versus Interpolating Models, 1998.