What Can We Learn from Multi-Objective Meta-Optimization of Evolutionary Algorithms in Continuous Domains?

Properly configuring Evolutionary Algorithms (EAs) is a challenging task, since many factors affect an EA’s performance: the properties of the fitness function, time and computational constraints, and others. Meta-optimization methods, in which a metaheuristic tunes the parameters of another (lower-level) metaheuristic that optimizes a given target function, most often optimize a single property of the lower-level method. In this paper, we show that using a multi-objective genetic algorithm to tune an EA makes it possible not only to find good parameter sets with respect to several objectives at the same time, but also to derive generalizable results that can provide guidelines for designing EA-based applications. In particular, we present a general framework for multi-objective meta-optimization and show that “going multi-objective” allows one to generate configurations that, besides optimally fitting an EA to a given problem, also perform well on previously unseen ones.
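The bi-level setup described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper’s actual framework: it assumes a toy (μ+λ)-style EA minimizing the sphere function as the lower level, and uses plain random sampling of configurations followed by Pareto filtering in place of a multi-objective GA such as NSGA-II; solution quality and evaluation count are stand-ins for the paper’s objectives.

```python
import random

def lower_level_ea(mutation_sigma, pop_size, generations=40, dim=5, seed=0):
    """Toy (mu+lambda)-style EA minimizing the sphere function f(x) = sum(x_i^2)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    evals = 0
    for _ in range(generations):
        # Each parent produces one Gaussian-mutated child.
        children = [[g + rng.gauss(0, mutation_sigma) for g in ind] for ind in pop]
        evals += len(children)
        # Truncation selection on parents + children.
        pop = sorted(pop + children, key=lambda x: sum(g * g for g in x))[:pop_size]
    best = sum(g * g for g in pop[0])
    return best, evals  # two objectives to minimize: quality and cost

def dominates(a, b):
    """Pareto dominance for minimization of all objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Meta-level: sample parameter sets, evaluate both objectives, keep the
# non-dominated configurations (the "performance front" of the tuner).
rng = random.Random(1)
configs = [(rng.uniform(0.01, 1.0), rng.choice([4, 8, 16, 32])) for _ in range(20)]
scored = [(cfg, lower_level_ea(*cfg)) for cfg in configs]
front = [(cfg, obj) for cfg, obj in scored
         if not any(dominates(other, obj) for _, other in scored if other != obj)]

for cfg, obj in sorted(front, key=lambda t: t[1][1]):
    print(f"sigma={cfg[0]:.2f} pop={cfg[1]:2d} -> quality={obj[0]:.3g}, evals={obj[1]}")
```

A full implementation would replace the random sampling with a multi-objective GA that evolves the configurations themselves, but the trade-off surfaced here (larger populations cost more evaluations and may buy better quality) is the same one the meta-optimizer navigates.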
