Profiling Iterative Optimization Heuristics

IOHprofiler is a benchmarking platform for evaluating the performance of iterative optimization heuristics (IOHs), e.g., Evolutionary Algorithms and Swarm-based Algorithms. We aim to integrate the various elements of the entire benchmarking pipeline, ranging from problem (instance) generators and modular algorithm frameworks, through automated algorithm configuration techniques and feature extraction methods, to the actual experimentation, data analysis, and visualization. The platform consists of the following major components:

  • IOHexperimenter for generating benchmarking suites, which produce experiment data;
  • IOHanalyzer for the statistical analysis and visualization of the experiment data;
  • IOHproblem for providing a collection of test functions;
  • IOHdata for hosting the benchmarking data sets from IOHexperimenter as well as from other platforms, e.g., BBOB/COCO and Nevergrad;
  • IOHalgorithm for efficient implementations of various classic optimization algorithms.

The composition of IOHprofiler and the coordination of its components are depicted below:

IOHexperimenter provides:

  • A generic framework to generate benchmarking suites for the optimization task you are interested in.
  • A Pseudo-Boolean Optimization (PBO) benchmark suite, containing 25 test problems of the kind $f\colon \{0,1\}^d \rightarrow \mathbb{R}$.
  • The integration of 24 Black-Box Optimization Benchmarking (BBOB) functions on the continuous domain, namely $f\colon \mathbb{R}^d \rightarrow \mathbb{R}$.
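To make the PBO setting concrete, here is a minimal, illustrative sketch (not the IOHexperimenter API) of a pseudo-Boolean test function $f\colon \{0,1\}^d \rightarrow \mathbb{R}$, together with a simple random search that records the best-so-far trace an experimenter would log; the function and algorithm names are hypothetical placeholders:

```python
import random


def onemax(x):
    """OneMax, a classic pseudo-Boolean test function f: {0,1}^d -> R.

    It simply counts the number of ones in the bit string.
    """
    return sum(x)


def random_search(f, d, budget, seed=0):
    """Evaluate `budget` uniformly random bit strings of length `d`.

    Returns the best-so-far function value after each evaluation,
    i.e., the kind of trace a benchmarking logger would record.
    """
    rng = random.Random(seed)
    best = float("-inf")
    trace = []
    for _ in range(budget):
        x = [rng.randint(0, 1) for _ in range(d)]
        best = max(best, f(x))
        trace.append(best)
    return trace


trace = random_search(onemax, d=20, budget=100)
```

In an actual IOHexperimenter run, such traces are produced automatically by the suite's logger rather than hand-rolled as above.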

IOHanalyzer provides:

  • Performance analysis from both the fixed-target and the fixed-budget perspective.
  • A web-based interface to interactively analyze and visualize the empirical performance of IOHs.
  • Statistical evaluation of algorithm performance.
  • R programming interfaces in the backend for even more customizable analysis.
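The two analysis perspectives can be read directly off a best-so-far trace: fixed-budget asks "what value is reached after B evaluations?", while fixed-target asks "how many evaluations until value T is reached?". A minimal sketch of both queries (illustrative only, not the IOHanalyzer interface; assumes a maximization trace as in the earlier example):

```python
def fixed_budget(trace, budget):
    """Best function value reached after `budget` evaluations (1-indexed)."""
    return trace[budget - 1]


def fixed_target(trace, target):
    """Number of evaluations needed to first reach `target`, or None if never hit."""
    for evals, value in enumerate(trace, start=1):
        if value >= target:
            return evals
    return None


# Example best-so-far trace of a maximization run:
trace = [1, 3, 3, 5, 8]
fixed_budget(trace, 3)   # value after 3 evaluations -> 3
fixed_target(trace, 5)   # evaluations to reach value 5 -> 4
```

IOHanalyzer aggregates such per-run statistics across many runs and algorithms, e.g., into expected running time (ERT) and empirical cumulative distribution plots.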


In addition to this wiki page, there is a number of more detailed documents to explore: