Update-based evolution control: A new fitness approximation method for evolutionary algorithms

Evolutionary algorithms are robust optimization methods that have been applied to many engineering problems. However, real-world fitness evaluations can be computationally expensive, so it may be necessary to estimate fitness with an approximate model. This article reviews design and analysis of computer experiments (DACE) as an approximation method that combines a global polynomial model with a local Gaussian model to estimate continuous fitness functions. DACE is incorporated into several evolutionary algorithms, which are tested on unconstrained and constrained benchmarks, both with and without fitness evaluation noise. The article also introduces a new evolution control strategy, called update-based control, that estimates the fitness of some individuals in each generation from the exact fitness values of other individuals in the same generation. The results show that update-based evolution control outperforms other strategies on noise-free, noisy, constrained, and unconstrained benchmarks, and that it can compensate for fitness evaluation noise.
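
The following is a minimal sketch, not the authors' implementation, of how update-based evolution control can be organized: each generation, a subset of the population receives exact (expensive) fitness evaluations, a simple DACE-style surrogate (polynomial trend plus Gaussian-kernel residual model) is fitted to those exact values, and the remaining individuals of the same generation receive estimated fitnesses from the surrogate. The fraction of exactly evaluated individuals, the `sphere` test function, and all helper names are illustrative assumptions.

```python
import numpy as np


def sphere(x):
    """Placeholder for an expensive exact fitness function (assumption)."""
    return float(np.sum(x ** 2))


def fit_surrogate(X, y, length_scale=1.0, ridge=1e-8):
    """Fit a linear polynomial trend plus a Gaussian-kernel residual model (DACE-like sketch)."""
    P = np.hstack([np.ones((len(X), 1)), X])                 # global (linear) trend
    beta, *_ = np.linalg.lstsq(P, y, rcond=None)
    resid = y - P @ beta                                     # local residuals
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * length_scale ** 2)) + ridge * np.eye(len(X))
    alpha = np.linalg.solve(K, resid)                        # interpolate residuals

    def predict(Z):
        Pz = np.hstack([np.ones((len(Z), 1)), Z])
        dz = np.sum((Z[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        Kz = np.exp(-dz / (2.0 * length_scale ** 2))
        return Pz @ beta + Kz @ alpha

    return predict


def evaluate_generation(pop, exact_fraction=0.5, rng=None):
    """Update-based control: exact fitness for some individuals, surrogate estimates for the rest."""
    rng = rng or np.random.default_rng(0)
    n = len(pop)
    n_exact = max(2, int(exact_fraction * n))
    exact_idx = rng.choice(n, size=n_exact, replace=False)

    fitness = np.empty(n)
    fitness[exact_idx] = [sphere(pop[i]) for i in exact_idx]         # exact evaluations
    surrogate = fit_surrogate(pop[exact_idx], fitness[exact_idx])    # fit on this generation only

    approx_idx = np.setdiff1d(np.arange(n), exact_idx)
    if len(approx_idx) > 0:
        fitness[approx_idx] = surrogate(pop[approx_idx])             # estimated fitnesses
    return fitness


# Usage: estimate fitness for a random population of 20 individuals in 5 dimensions.
population = np.random.default_rng(1).uniform(-5, 5, size=(20, 5))
print(evaluate_generation(population))
```

Fitting the surrogate only on individuals evaluated in the current generation is the distinguishing idea sketched here; the actual choice of which individuals receive exact evaluations, and the surrogate model itself, follow the article's DACE formulation rather than this simplified version.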
