MOTA: A Many-Objective Tuning Algorithm Specialized for Tuning under Multiple Objective Function Evaluation Budgets

Control parameter studies assist practitioners in selecting optimization algorithm parameter values that are appropriate for the problem at hand. Parameter values are well suited to a problem if they result in a search that is effective given that problem's objective function(s), constraints, and termination criteria. Given these considerations, a many-objective tuning algorithm named MOTA is presented. MOTA is specialized for tuning a stochastic optimization algorithm according to multiple performance measures, each over a range of objective function evaluation budgets. MOTA's specialization consists of four aspects: (1) a tuning problem formulation that consists of both a speed objective and a speed decision variable; (2) a control parameter tuple assessment procedure that utilizes information from a single assessment run's history to gauge that tuple's performance at multiple evaluation budgets; (3) a preemptively terminating resampling strategy for handling the noise present when tuning stochastic algorithms; and (4) the use of bi-objective decomposition to assist in many-objective optimization. MOTA combines these aspects with differential evolution operators to search for effective control parameter values. Numerical experiments consisting of tuning NSGA-II and MOEA/D demonstrate that MOTA is effective at many-objective tuning.
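To illustrate aspect (2), the sketch below shows how a single assessment run's recorded history can be reused to gauge a control parameter tuple's performance at several evaluation budgets, rather than running one assessment per budget. This is a minimal illustration, not the paper's implementation; the names `history`, `budgets`, and `performance_at_budgets` are assumptions introduced here, and the history is assumed to store one performance-indicator value per objective function evaluation.

```python
# Hypothetical sketch: read a tuple's performance at multiple evaluation
# budgets from one run's history (names and data layout are assumptions).

def performance_at_budgets(history, budgets):
    """history[i] = performance-indicator value after (i + 1) objective
    function evaluations of a single assessment run.
    Returns {budget: indicator value reached within that budget}."""
    results = {}
    for b in budgets:
        if b < 1:
            raise ValueError("budget must be at least one evaluation")
        # Clamp to the run length if the requested budget exceeds it.
        idx = min(b, len(history)) - 1
        results[b] = history[idx]
    return results

# Usage: one run of 10,000 evaluations assessed at three budgets.
run_history = [1.0 / (i + 1) for i in range(10_000)]  # stand-in indicator trace
print(performance_at_budgets(run_history, [1_000, 5_000, 10_000]))
```

The point of the design is economy: every recorded run already contains the information needed for all budgets of interest, so the tuner pays one set of assessment runs and evaluates many budget-specific objectives from it.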
