Accumulative sampling for noisy evolutionary multi-objective optimization

Objective evaluation is subject to noise in many real-world problems. Noise can degrade the performance of multi-objective evolutionary algorithms by misleading the population toward a local optimum and slowing convergence. This paper proposes three novel noise-handling techniques: accumulative sampling, a new ranking method, and a different selection scheme for recombination. Accumulative sampling is essentially a form of dynamic resampling, but it does not explicitly decide the number of samples. Instead, it repeatedly takes additional samples of the objectives for the solutions in the archive at every generation, and updates the estimated objectives using all the accumulated samples. The new ranking method combines probabilistic Pareto rank and crowding distance into a single aggregated value to promote diversity in the archive. Finally, the fitness function and selection method used for recombination are made different from those used for the archive, to accelerate convergence. Experiments on various benchmark problems show that the algorithm adopting all three features outperforms other MOEAs on a range of performance metrics.
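The core of accumulative sampling, as described above, can be sketched as a running estimate that absorbs one additional noisy sample per generation. The sketch below is illustrative only: the objective function, noise model, and class names are assumptions, not the paper's actual implementation.

```python
import random
from statistics import mean

# Hypothetical noisy objective: a true value corrupted by Gaussian noise.
def noisy_objective(x, sigma=0.5):
    true_value = x * x  # stand-in for a real (expensive) objective
    return true_value + random.gauss(0.0, sigma)

class ArchiveSolution:
    """Archive member that keeps every sample ever taken.

    No explicit sample count is chosen; the estimate simply uses
    all samples accumulated so far (mean of the history).
    """
    def __init__(self, x):
        self.x = x
        self.samples = []

    def accumulate(self):
        # One additional sample per generation.
        self.samples.append(noisy_objective(self.x))

    @property
    def estimate(self):
        return mean(self.samples)

random.seed(0)
sol = ArchiveSolution(x=2.0)
for _generation in range(200):
    sol.accumulate()
# With 200 accumulated samples, the estimate is close to the
# noise-free value 4.0 (std. error of the mean ~ 0.5 / sqrt(200)).
print(abs(sol.estimate - 4.0) < 0.2)
```

In a full algorithm, each archive member would carry one such accumulator per objective, so estimates for long-surviving solutions keep improving without a separately budgeted resampling phase.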