Effects of discrete objective functions with different granularities on the search behavior of EMO algorithms

Objective functions in combinatorial optimization are discrete, and the number of values a discrete objective function can take varies widely from problem to problem. Optimizing a discrete objective function is often very difficult. In multiobjective optimization, each objective function may have a different number of possible values, so each axis of the objective space has its own granularity: some axes may be finely grained while others are coarse. In this paper, we examine the effects of discrete objective functions with different granularities on the search behavior of EMO (evolutionary multiobjective optimization) algorithms through computational experiments. Experimental results show that a discrete objective function with a coarse granularity slows down the search of EMO algorithms along that objective. Interestingly, such a slow-down along one objective often speeds up the search along the other objectives. We also examine the effect of adding a small random noise to each discrete objective function in order to increase the number of possible objective values.
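
The abstract does not give an implementation, but the two ideas it relies on, a coarse-grained objective and a small additive noise that restores distinct values, can be illustrated with a minimal Python sketch. The function names (coarsen, perturb) and the parameters (granularity, noise_width) below are illustrative assumptions for this sketch, not the authors' experimental setup.

import random

def coarsen(value, granularity):
    # Restrict an objective value to a grid with the given step size,
    # so the objective can take only a limited number of distinct values.
    return round(value / granularity) * granularity

def perturb(value, noise_width):
    # Add a small uniform noise so that solutions sharing the same
    # coarse objective value become distinguishable during selection.
    return value + random.uniform(-noise_width / 2.0, noise_width / 2.0)

if __name__ == "__main__":
    f = 0.4317
    print(coarsen(f, 0.01))                  # fine grid (step 0.01): stays close to 0.43
    print(coarsen(f, 0.25))                  # coarse grid (step 0.25): collapses to 0.5
    print(perturb(coarsen(f, 0.25), 0.25))   # coarse value plus tie-breaking noise

In a sketch like this, a coarse grid makes many non-dominated solutions share the same value along that objective; the added noise is one way to break such ties, which is the situation the abstract's noise experiment addresses.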
