A Study of the Combination of Variation Operators in the NSGA-II Algorithm

Multi-objective evolutionary algorithms rely on variation operators as their basic mechanism to drive the evolutionary process. These operators are usually fixed and applied in the same way throughout the algorithm's execution, e.g., the mutation probability in genetic algorithms. This paper analyses whether a more dynamic approach, combining different operators whose application rates vary along the search, can improve on the classical static behavior. To this end, we explore the combined use of three different operators (simulated binary crossover, the differential evolution operator, and polynomial mutation) in the NSGA-II algorithm. We consider two strategies for selecting the operators: random and adaptive. The resulting variants have been tested on a set of 19 complex problems, and our results indicate that both schemes significantly improve the performance of the original NSGA-II algorithm, with the random and adaptive variants achieving the best overall results on the bi-objective and three-objective problems, respectively.
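
The following is a minimal sketch, not the authors' implementation, of how operator selection could be plugged into NSGA-II's reproduction step. The operator names mirror the paper (SBX, differential evolution, polynomial mutation), but the credit-assignment rule, decay factor, and minimum probability below are illustrative assumptions rather than the paper's exact adaptive scheme.

```python
import random

# Operators considered in the paper; the actual operator implementations
# (SBX, DE, polynomial mutation) are assumed to exist elsewhere.
OPERATORS = ["sbx", "de", "polynomial_mutation"]


def select_operator_random():
    """Random strategy: each mating event picks an operator uniformly."""
    return random.choice(OPERATORS)


class AdaptiveSelector:
    """Adaptive strategy (assumed form): operators are chosen with
    probability proportional to how often their recent offspring survived
    NSGA-II's environmental selection."""

    def __init__(self, operators=OPERATORS, min_prob=0.05, decay=0.9):
        self.credits = {op: 1.0 for op in operators}  # start uniform
        self.min_prob = min_prob                      # keep some exploration
        self.decay = decay                            # forget old rewards

    def select(self):
        total = sum(self.credits.values())
        probs = {op: max(c / total, self.min_prob) for op, c in self.credits.items()}
        norm = sum(probs.values())
        r, acc = random.random() * norm, 0.0
        for op, p in probs.items():
            acc += p
            if r <= acc:
                return op
        return next(iter(probs))  # numerical fallback

    def reward(self, operator, survived):
        """Call after environmental selection: credit the operator when the
        offspring it produced enters the next population."""
        self.credits[operator] = self.decay * self.credits[operator] + (1.0 if survived else 0.0)
```

In use, the selector would replace the fixed crossover/mutation call inside the NSGA-II mating loop: select an operator per offspring, apply it, and after the non-dominated sorting step call `reward` with whether that offspring survived.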
