Performance comparison of EMO algorithms on test problems with different search space shapes

We examine the performance of evolutionary multi-objective optimization (EMO) algorithms on various shapes of the search space in the objective space (i.e., the feasible region in the objective space). To analyze the advantages and disadvantages of each EMO algorithm with respect to the shape of the search space, we propose a meta-optimization method that automatically creates multi-objective optimization problems (MOPs) which expose those advantages and disadvantages. In particular, we propose a two-level model to generate such MOPs. In the upper level, MOPs are handled as solutions, and some design variables of each MOP are optimized. In the lower level, each MOP is used to calculate the relative performance between two EMO algorithms, and this relative performance is regarded as the fitness of the MOP in the upper level. Thus, by maximizing the relative performance, we obtain an MOP that differentiates the search performance of the two EMO algorithms. Through computational experiments, we obtained two interesting observations. One is that Pareto dominance-based EMO algorithms have a low ability to escape from local Pareto-optimal regions. The other is that it is difficult for decomposition-based and indicator-based EMO algorithms to find solutions along the entire Pareto front.
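
The abstract does not give implementation details, but a minimal sketch of the two-level model, under stated assumptions, might look as follows. Every name here (mop_from_params, global_search, local_search, relative_performance) and the toy problem formulation are hypothetical placeholders, not the paper's actual method: in the real study the lower level would run two actual EMO algorithms (e.g., NSGA-II and MOEA/D) on each candidate MOP and compare them with a quality indicator, whereas this sketch uses two dummy solvers and a 2-D hypervolume.

```python
import numpy as np

# Sketch of the two-level model: the upper level evolves the design variables
# of a candidate MOP; the lower level scores each candidate by how much one
# (dummy) solver outperforms another on it.  Purely illustrative assumptions.

def mop_from_params(params):
    """Hypothetical bi-objective problem whose feasible-region shape in the
    objective space is controlled by the upper-level design variables."""
    def evaluate(x):
        f1 = np.sum(x ** 2) + params[0] * np.sin(5.0 * x[0])
        f2 = np.sum((x - 1.0) ** 2) + params[1] * np.cos(5.0 * x[1])
        return np.array([f1, f2])
    return evaluate

def nondominated(points):
    """Keep only the non-dominated objective vectors (minimization)."""
    return np.array([p for p in points
                     if not any(np.all(q <= p) and np.any(q < p) for q in points)])

def global_search(mop, n_evals, rng):
    """Dummy solver A: uniform random sampling of the decision space."""
    return nondominated([mop(rng.uniform(-1.0, 2.0, 2)) for _ in range(n_evals)])

def local_search(mop, n_evals, rng):
    """Dummy solver B: greedy local perturbations, prone to getting stuck in
    local Pareto-optimal regions (stands in for a second EMO algorithm)."""
    x = rng.uniform(-1.0, 2.0, 2)
    fx = mop(x)
    archive = [fx]
    for _ in range(n_evals - 1):
        y = np.clip(x + rng.normal(scale=0.05, size=2), -1.0, 2.0)
        fy = mop(y)
        if np.all(fy <= fx):          # accept only moves that weakly dominate
            x, fx = y, fy
        archive.append(fy)
    return nondominated(archive)

def hypervolume_2d(front, ref):
    """Exact 2-D hypervolume of a non-dominated front w.r.t. a reference point."""
    front = front[np.argsort(front[:, 0])]
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        if f1 < ref[0] and f2 < prev_f2:
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def relative_performance(params, rng, n_evals=200, ref=(15.0, 15.0)):
    """Upper-level fitness: hypervolume gap between the two solvers on the MOP."""
    mop = mop_from_params(params)
    ref = np.asarray(ref)
    return (hypervolume_2d(global_search(mop, n_evals, rng), ref)
            - hypervolume_2d(local_search(mop, n_evals, rng), ref))

# Upper level: a simple (1+1)-style search that mutates the MOP design
# variables and keeps the candidate maximizing the performance gap.
rng = np.random.default_rng(0)
params, best = rng.uniform(-1.0, 1.0, 2), -np.inf
for _ in range(50):
    cand = params + rng.normal(scale=0.2, size=2)
    fit = relative_performance(cand, rng)
    if fit > best:
        params, best = cand, fit
print("MOP design variables that most favour solver A:", params, "gap:", best)
```

In this toy setting, maximizing the hypervolume gap pushes the upper level toward MOPs whose search-space shape traps the local solver, mirroring how the paper's framework searches for problems that separate, for example, dominance-based from decomposition-based algorithms.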
