Benchmarking discrete optimization heuristics with IOHprofiler

Abstract: Automated benchmarking environments aim to support researchers in understanding how different algorithms perform on different types of optimization problems. Such comparisons provide insights into the strengths and weaknesses of different approaches, which can be leveraged to design new algorithms and to automate algorithm selection and configuration. With the ultimate goal of creating a meaningful benchmark set for iterative optimization heuristics, we have recently released IOHprofiler, a software tool built to produce detailed performance comparisons between iterative optimization heuristics. In the present work we demonstrate that IOHprofiler provides a suitable environment for automated benchmarking. We compile and assess a selection of 23 discrete optimization problems that exhibit different types of fitness landscapes. For each selected problem we compare the performance of twelve different heuristics, which are now available as baseline algorithms in IOHprofiler. We also provide a new module for IOHprofiler that extends the fixed-target and fixed-budget results for the individual problems with empirical cumulative distribution function (ECDF) results, allowing aggregated performance statistics to be derived for groups of problems.
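As a minimal illustration of the ECDF aggregation mentioned above, the following Python sketch computes, for each evaluation budget, the fraction of (run, target) pairs that reached their target within that budget. This is a hedged, self-contained example under simplified assumptions, not the IOHprofiler interface itself; all names and data are illustrative.

    from bisect import bisect_right

    def ecdf(hitting_times, budgets):
        """hitting_times: first-hitting times, one entry per (run, target) pair,
        with None meaning the target was never reached; budgets: sorted list of
        evaluation budgets. Returns the fraction of pairs solved within each budget."""
        finite = sorted(t for t in hitting_times if t is not None)
        total = len(hitting_times)
        return [bisect_right(finite, b) / total for b in budgets]

    # Example: 3 runs x 2 targets on one problem; None marks a missed target.
    times = [12, 40, 95, 7, None, 63]
    print(ecdf(times, budgets=[10, 50, 100]))  # -> [0.1666..., 0.5, 0.8333...]

Unsuccessful runs count toward the denominator but never toward the numerator, which is how fixed-target ECDF curves penalize runs that fail to reach a target; aggregating over groups of problems amounts to pooling the (run, target) pairs of all problems before computing the same curve.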
