Profiling Iterative Optimization Heuristics

IOHprofiler is a tool for benchmarking and analyzing iterative optimization heuristics (IOHs), e.g., Evolutionary Algorithms and Swarm-based Algorithms. It consists of two parts:

  • IOHexperimenter for generating benchmarking suites, which produce experiment data, and
  • IOHanalyzer for the statistical analysis and visualization of the experiment data.

The common workflow of benchmarking and empirical analysis is as follows: IOHexperimenter runs an algorithm on a benchmark suite and logs its performance data, which IOHanalyzer then reads for statistical analysis and visualization.

IOHexperimenter provides:

  • a generic framework to generate benchmarking suites for the optimization task you’re interested in,
  • a Pseudo-Boolean Optimization (PBO) benchmark suite, containing 23 test problems of the kind $f\colon \{0,1\}^d \rightarrow \mathbb{R}$, and
  • the integration of 24 Black-Box Optimization Benchmarking (BBOB) functions on the continuous domain, namely $f\colon \mathbb{R}^d \rightarrow \mathbb{R}$.
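To make the problem format concrete, here is a minimal, self-contained sketch of the kind of pseudo-Boolean problem $f\colon \{0,1\}^d \rightarrow \mathbb{R}$ that the PBO suite contains, together with a simple (1+1) EA solving it. This example uses plain Python for illustration only; it does not use IOHexperimenter's actual API, and the function and parameter names (`one_max`, `one_plus_one_ea`, `budget`) are hypothetical choices for this sketch.

```python
import random

def one_max(x):
    """OneMax: count the 1-bits of a bit string.
    A classic pseudo-Boolean problem f: {0,1}^d -> R (maximization)."""
    return sum(x)

def one_plus_one_ea(f, d, budget, seed=0):
    """A (1+1) EA: flip each bit independently with probability 1/d,
    and keep the offspring if it is at least as good as the parent."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(d)]
    fx = f(x)
    for _ in range(budget):
        y = [1 - b if rng.random() < 1.0 / d else b for b in x]
        fy = f(y)
        if fy >= fx:  # accept ties to allow neutral drift
            x, fx = y, fy
    return fx

best = one_plus_one_ea(one_max, d=20, budget=2000)
```

In an actual IOHexperimenter run, the suite supplies the problem and a logger records the best-so-far function value at each evaluation; those logs are the experiment data that IOHanalyzer consumes.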

IOHanalyzer provides:

  • a web-based interface to analyze and visualize the empirical performance of IOHs,
  • interactive plots,
  • statistical evaluation,
  • report generation, and
  • an R programming interface in the backend.


We thank our colleagues Anne Auger, Dimo Brockhoff, Arina Buzdalova, Maxim Buzdalov, Johann Dréo, Nikolaus Hansen, Pietro S. Oliveto, Ofer Shir, Markus Wagner, and Thomas Weise for various discussions around the benchmarking of iterative optimization heuristics.

Parts of our work have been inspired by working group 3 of COST Action CA15140 ‘Improving Applicability of Nature-Inspired Optimisation by Joining Theory and Practice (ImAppNIO)’ supported by the European Cooperation in Science and Technology.

Our work has been supported by a public grant as part of the Investissement d’avenir project, reference ANR-11-LABX-0056-LMH, LabEx LMH, in a joint call with the Gaspard Monge Program for optimization, operations research, and their interactions with data sciences.

Furong Ye acknowledges financial support from the China Scholarship Council, CSC No. 201706310143.