IOHprofiler is a new tool for analyzing and comparing iterative optimization heuristics. Given as input algorithms and problems written in C or Python, it provides as output a statistical evaluation of the algorithms' performance by means of the distributions of the fixed-target running times and the fixed-budget function values. In addition, IOHprofiler also allows the user to track the evolution of algorithm parameters, making our tool particularly useful for the analysis, comparison, and design of (self-)adaptive algorithms.
IOHprofiler is a ready-to-use software. It consists of two parts: an experimental part, which generates the running time data, and a post-processing part, which produces the summarizing comparisons and statistical evaluations. The experimental part is built on the COCO software, which has been adjusted to cope with optimization problems that are formulated as functions $f:\mathcal{S}^n \to \mathbb{R}$ with $\mathcal{S}$ being a discrete alphabet of integers. The post-processing part is our own work. It can be used as a stand-alone tool for the evaluation of running time data of arbitrary benchmark problems. It accepts as input files not only the output files of IOHprofiler, but also original COCO data files. The post-processing tool is designed for an interactive evaluation, allowing users to choose the ranges and the precision of the displayed data according to their needs.
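To make the setting concrete, the following is a minimal self-contained sketch (not the IOHprofiler API) of a discrete benchmark problem $f:\{0,1\}^n \to \mathbb{R}$ together with a simple randomized local search that records fixed-target running times, i.e. the first evaluation at which each target value is reached. All function and parameter names here are illustrative assumptions.

```python
# Illustrative sketch of fixed-target running time logging on a
# discrete problem; this does NOT use the IOHprofiler library itself.
import random

def one_max(x):
    """OneMax benchmark: the number of ones in the bit string."""
    return sum(x)

def run_rls(n, targets, max_evals=10_000, seed=0):
    """Randomized local search (flip one uniformly chosen bit, accept
    if not worse) on OneMax; returns {target: first hitting time}."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = one_max(x)
    evals = 1
    hits = {t: evals for t in targets if fx >= t}
    while evals < max_evals and fx < n:
        y = x[:]
        i = rng.randrange(n)
        y[i] = 1 - y[i]          # flip one bit
        fy = one_max(y)
        evals += 1
        if fy >= fx:             # accept if not worse
            x, fx = y, fy
        for t in targets:        # log first hitting time per target
            if t not in hits and fx >= t:
                hits[t] = evals
    return hits

hits = run_rls(n=20, targets=[10, 15, 20])
```

Aggregating such `hits` dictionaries over many independent runs and problems yields exactly the kind of fixed-target data that the post-processing part evaluates statistically.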
IOHprofiler is available on GitHub at \url{this https URL}.