Studying the Effect of Robustness Measures in Offline Parameter Tuning for Estimating the Performance of MOEA/D

Offline parameter tuning (OPT) of multi-objective evolutionary algorithms (MOEAs) aims to find a set of parameters that performs well across a large number of problems. According to the no-free-lunch (NFL) theorem, no algorithm can achieve the best performance on all classes of optimization problems. It is, however, possible to find an appropriate parameter set of an algorithm for a particular class of problems. To do so, we need to study how to aggregate the quality of an algorithmic configuration assessed on a set of optimization problems. In this paper, we study robustness measures for dealing with the parameter settings of stochastic algorithms. We focus on decomposition-based MOEAs and propose to tune their scalarizing functions for solving classes of problems defined by their Pareto front shapes, with up to 7 objective functions. Based on our experimental results, we derive guidelines for evaluating the quality of algorithmic configurations using a combination of descriptive statistics.
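
As an illustration only (the paper provides no code), the Python sketch below shows one way the quality of a single algorithmic configuration could be aggregated across problems and independent runs by combining descriptive statistics. The function names, the hypervolume-like indicator values, and the weighting between mean and worst-case performance are all assumptions made for this example, not the authors' actual procedure.

```python
# Hypothetical sketch: aggregating the quality of one algorithmic configuration
# across several problems and independent runs using descriptive statistics.
# Indicator values are assumed to be hypervolume-like scores (larger is better).
import statistics


def robustness_summary(indicator_values):
    """Summarize the per-run indicator values of one configuration on one problem."""
    return {
        "mean": statistics.mean(indicator_values),
        "median": statistics.median(indicator_values),
        "stdev": statistics.stdev(indicator_values) if len(indicator_values) > 1 else 0.0,
        "worst": min(indicator_values),  # worst case for a maximization indicator
    }


def aggregate_configuration_quality(runs_per_problem, weight_worst=0.5):
    """Aggregate one configuration over a set of problems.

    `runs_per_problem` maps a problem name to the indicator values obtained from
    independent runs. The score blends mean and worst-case performance; the 50/50
    weighting is an arbitrary illustrative choice.
    """
    per_problem = {p: robustness_summary(v) for p, v in runs_per_problem.items()}
    score = statistics.mean(
        (1.0 - weight_worst) * s["mean"] + weight_worst * s["worst"]
        for s in per_problem.values()
    )
    return score, per_problem


if __name__ == "__main__":
    # Toy indicator values for one MOEA/D configuration on two DTLZ-like problems.
    runs = {
        "DTLZ1_3obj": [0.91, 0.88, 0.93, 0.90],
        "DTLZ2_7obj": [0.74, 0.70, 0.78, 0.76],
    }
    score, details = aggregate_configuration_quality(runs)
    print(f"aggregated score: {score:.3f}")
    for problem, stats in details.items():
        print(problem, stats)
```

A ranking of candidate configurations (e.g., different scalarizing functions) could then be obtained by comparing their aggregated scores, which is the kind of decision an offline tuner has to make from noisy, per-run data.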
