IOHanalyzer: Performance Analysis for Iterative Optimization Heuristics

We propose IOHanalyzer, a new software tool for analyzing the empirical performance of iterative optimization heuristics (IOHs) such as local search algorithms, genetic and evolutionary algorithms, Bayesian optimization algorithms, and similar optimizers. Implemented in R and C++, IOHanalyzer is fully open source and available on CRAN and GitHub. It provides a platform for analyzing and visualizing the performance of IOHs on real-valued, single-objective optimization tasks, with detailed statistics on the fixed-target running times and fixed-budget results of the benchmarked algorithms. Performance can also be aggregated over several benchmark problems, for example in the form of empirical cumulative distribution functions. A key advantage of IOHanalyzer over other performance analysis packages is its highly interactive design, which lets users specify the performance measures, ranges, and granularity that are most relevant for their experiments. It is designed to analyze not only performance traces but also the evolution of dynamic state parameters. IOHanalyzer can directly process performance data from the main benchmarking platforms, including the COCO platform, Nevergrad, and our own IOHexperimenter. An R programming interface is provided for users who prefer finer control over the implemented functionality.
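
As an illustration of the workflow described above, the sketch below shows how the package could be installed from CRAN, how the interactive GUI is launched, and how fixed-target statistics might be queried through the R programming interface. This is a minimal sketch: the function names runServer, DataSetList, and get_RT_summary follow the package documentation, while the data folder path and the target value are hypothetical placeholders.

    # Minimal sketch of the IOHanalyzer workflow; the data path below is hypothetical.
    install.packages("IOHanalyzer")   # install the released version from CRAN
    library(IOHanalyzer)

    # Launch the interactive (Shiny-based) graphical user interface
    runServer()

    # Alternatively, work through the R programming interface:
    # read benchmark data (e.g., produced by IOHexperimenter) from a local folder
    dsl <- DataSetList("./my_benchmark_data")

    # Summarize fixed-target running times at a chosen target value
    get_RT_summary(dsl, ftarget = 1e-8)

The same DataSetList object can then be passed to the package's other summary and plotting routines, mirroring what the web-based GUI exposes interactively.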
