Evaluating selection methods on hyper-heuristic multi-objective particle swarm optimization

Multi-objective particle swarm optimization (MOPSO) is a promising meta-heuristic for solving multi-objective problems (MOPs). Previous works have shown that selecting a proper combination of leader and archiving methods, which is a challenging task, improves the search ability of the algorithm. A previous study employed a simple hyper-heuristic to select these components and obtained good results. In this research, we analyze whether more advanced heuristic selection methods further improve the search ability of the algorithm. Empirical studies are conducted to investigate this hypothesis. First, four heuristic selection methods are compared: a choice function, a multi-armed bandit, random selection, and the previously proposed roulette wheel. A second study identifies whether it is best to adapt only the leader method, only the archiving method, or both simultaneously. Moreover, the influence of the interval used to replace the low-level heuristic is analyzed. Finally, the best variant is compared to a hyper-heuristic framework that integrates a multi-armed bandit algorithm into the multi-objective evolutionary algorithm based on decomposition with dynamical resource allocation (MOEA/D-DRA), as well as to a state-of-the-art MOPSO. Our results indicate that the resulting algorithm outperforms the hyper-heuristic framework on most of the problems investigated and achieves competitive results compared to the state-of-the-art MOPSO.
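
As a concrete illustration of one of the heuristic selection methods compared in this work, the sketch below shows how a UCB1-style multi-armed bandit could choose among low-level heuristics (leader/archiving combinations) and be rewarded with a quality gain measured after each replacement interval. This is a minimal sketch, not the implementation evaluated here: the class name, the exploration constant c, the example heuristic pairs, and the reward definition are illustrative assumptions.

    import math

    class BanditSelector:
        """UCB1-style selector over low-level heuristics (illustrative sketch)."""

        def __init__(self, heuristics, c=1.0):
            # `heuristics` could be (leader method, archiving method) pairs,
            # e.g. [("sigma", "crowding distance"), ("NWSum", "ideal")] (assumed names).
            self.heuristics = heuristics
            self.c = c                              # exploration weight (assumed value)
            self.counts = [0] * len(heuristics)     # applications of each heuristic
            self.values = [0.0] * len(heuristics)   # running mean reward

        def select(self):
            # Apply every heuristic once before using the UCB1 rule.
            for i, n in enumerate(self.counts):
                if n == 0:
                    return i
            total = sum(self.counts)
            scores = [
                self.values[i]
                + self.c * math.sqrt(2.0 * math.log(total) / self.counts[i])
                for i in range(len(self.heuristics))
            ]
            return max(range(len(self.heuristics)), key=scores.__getitem__)

        def update(self, index, reward):
            # `reward` is assumed to be a quality gain (e.g. a hypervolume or R2
            # improvement) observed after running the swarm for one replacement
            # interval with the chosen leader/archiving combination.
            self.counts[index] += 1
            self.values[index] += (reward - self.values[index]) / self.counts[index]

    # Hypothetical usage inside the optimization loop:
    # selector = BanditSelector([("sigma", "crowding"), ("NWSum", "ideal")])
    # idx = selector.select()
    # ...run the MOPSO for one replacement interval with heuristics[idx]...
    # selector.update(idx, reward=observed_quality_gain)

The other selection methods compared (choice function, roulette wheel, random) would plug into the same select/update interface, differing only in how the next heuristic is chosen and how past rewards are aggregated.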
