## Compiling the package

If you are using the tool for the first time, please download or clone this branch and run make in the root directory of the project. After compilation:

• Object files will be generated in build/Cpp/obj
• Three executable files will be generated in build/Cpp/bin

## Use cases

After completing the initial compilation, you can move to the folder build/Cpp and use the Makefile therein for your experiments. The remainder of this tutorial assumes the working directory is set to the build/Cpp folder.

## Using individual test problems

To use IOHexperimenter to run benchmarking on a specific problem, the template file IOHprofiler_run_problem.cpp is provided. Since every problem within IOHexperimenter is defined as a class derived from the IOHprofiler_problem class, it is quite straightforward to use them.

An example of testing an evolutionary algorithm with a mutation operator on OneMax is implemented in IOHprofiler_run_problem.cpp. To use a different problem, modify the include statement to include the problem to use, and use the corresponding class name instead of OneMax.

For this example, a OneMax class is declared and initialized with dimension 1000 on the default instance of the problem.

```cpp
OneMax om;
int dimension = 1000;
om.Initilize_problem(dimension);
```


During the optimization process, the algorithm can acquire fitness values through the evaluate() function. In the example below, om.evaluate(x) returns the fitness of x. Another option is the statement om.evaluate(x,y), which stores the fitness of x in y. logger is an IOHprofiler_csv_logger instance, which stores function evaluations in a format compatible with IOHanalyzer. logger.write_line(om.loggerInfo()) delivers the latest information of the tested om to the logger. In addition, om.IOHprofiler_hit_optimal() is an indicator you can use to check whether the optimum has been found.

```cpp
while (!om.IOHprofiler_hit_optimal()) {
  x = x_star;
  if (mutation(x, mutation_rate)) {
    y = om.evaluate(x);                  // y holds the fitness of x
    logger.write_line(om.loggerInfo());  // record this evaluation
  }
  if (y > best_value) {
    best_value = y;
    x_star = x;
  }
}
```
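The mutation helper used in the loop is not shown in this snippet. A minimal sketch of a standard bit-flip mutation with the same name and signature (the body below is an assumption for illustration, not the implementation shipped in IOHprofiler_run_problem.cpp) could look like:

```cpp
#include <random>
#include <vector>

// Flips each bit of x independently with probability mutation_rate and
// reports whether any bit changed, so the caller can skip re-evaluation.
bool mutation(std::vector<int> &x, double mutation_rate) {
  static std::mt19937 gen(std::random_device{}());
  std::bernoulli_distribution flip(mutation_rate);
  bool changed = false;
  for (auto &bit : x) {
    if (flip(gen)) {
      bit = 1 - bit;
      changed = true;
    }
  }
  return changed;
}
```

Returning a bool lets the loop above evaluate the offspring only when it actually differs from its parent.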


If, for your experiment, you want to generate data to be used in IOHanalyzer, an IOHprofiler_csv_logger should be added to the problem you are testing. The arguments of IOHprofiler_csv_logger are the directory of the result folder, the name of the result folder, the name of the algorithm, and information about the algorithm. With different settings of triggers (observer), multiple data files are generated for each experiment. More details on the available triggers are available here. Before optimizing a problem, the logger must be targeted to the problem using the statement logger.target_problem(), whose arguments are the problem id, dimension, instance id, problem name, and the type of optimization (maximization or minimization).

```cpp
std::vector<int> time_points{1,2,5};
std::shared_ptr<IOHprofiler_csv_logger> logger(new IOHprofiler_csv_logger("./","run_problem","EA","EA"));
logger->set_complete_flag(true);
logger->set_interval(0);
logger->set_time_points(time_points,10);
logger->activate_logger();
logger->target_problem(om.IOHprofiler_get_problem_id(),
                       om.IOHprofiler_get_number_of_variables(),
                       om.IOHprofiler_get_instance_id(),
                       om.IOHprofiler_get_problem_name(),
                       om.IOHprofiler_get_optimization_type());
```


## Using pre-installed Benchmark Suites

Suites are collections of test problems. The idea behind a suite is that packing problems with similar properties together makes it easier to test algorithms on a class of problems. Currently, two pre-defined suites are available: PBO, consisting of 23 pseudo-Boolean problems, and BBOB, consisting of 24 real-valued problems. To find out how to create your own suites, please visit this page.

An example of testing an evolutionary algorithm with a mutation operator on the PBO suite is implemented in IOHprofiler_run_suite.cpp. The PBO suite includes pointers to 23 problems. To instantiate the problems you want to test, vectors of problem ids, instance ids, and dimensions need to be given as follows:

```cpp
std::vector<int> problem_id = {1,2};
std::vector<int> instance_id = {1,2};
std::vector<int> dimension = {100,200,300};
PBO_suite pbo(problem_id,instance_id,dimension);
```


With the suite, you can test the problems of the suite one by one until all problems have been tested. In this example, the order of problems is as follows, and an evolutionary_algorithm is applied:

1. problem id 1, instance 1, dimension 100
2. problem id 1, instance 2, dimension 100
3. problem id 1, instance 1, dimension 200
4. problem id 1, instance 2, dimension 200
5. problem id 1, instance 1, dimension 300
6. problem id 1, instance 2, dimension 300
7. problem id 2, instance 1, dimension 100
8. problem id 2, instance 2, dimension 100
9. problem id 2, instance 1, dimension 200
10. problem id 2, instance 2, dimension 200
11. problem id 2, instance 1, dimension 300
12. problem id 2, instance 2, dimension 300

```cpp
while ((problem = pbo.get_next_problem())) {
  evolutionary_algorithm(problem);
}
```
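For readers who want a feel for what the body of evolutionary_algorithm(problem) can look like, here is a library-free sketch of the same (1+1) EA loop structure. The OneMax fitness is inlined rather than taken from the suite, so all names below are illustrative assumptions; in the real loop, problem->evaluate(x) plays the role of onemax().

```cpp
#include <numeric>
#include <random>
#include <vector>

// Stand-in for problem->evaluate(x): counts the ones in the bit string.
int onemax(const std::vector<int> &x) {
  return std::accumulate(x.begin(), x.end(), 0);
}

// (1+1) EA with standard bit mutation: runs until the optimum (all ones)
// is found or the budget is spent; returns the best fitness reached.
int one_plus_one_ea(int dimension, int budget) {
  std::mt19937 gen(42);
  std::bernoulli_distribution flip(1.0 / dimension);
  std::uniform_int_distribution<int> bit(0, 1);

  std::vector<int> x_star(dimension);
  for (auto &b : x_star) b = bit(gen);   // random initial parent
  int best = onemax(x_star);

  for (int eval = 1; eval < budget && best < dimension; ++eval) {
    std::vector<int> x = x_star;
    for (auto &b : x) if (flip(gen)) b = 1 - b;  // flip each bit w.p. 1/n
    int y = onemax(x);
    if (y >= best) { best = y; x_star = x; }     // accept if not worse
  }
  return best;
}
```

With IOHexperimenter, the fitness call would additionally feed the logger, and IOHprofiler_hit_optimal() would replace the hand-rolled optimum check.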


If, for your experiment, you want to generate data to be used in IOHanalyzer, an IOHprofiler_csv_logger should be added to the suite. The arguments of IOHprofiler_csv_logger are the directory of the result folder, the name of the result folder, the name of the algorithm, and information about the algorithm. In addition, you can set up multiple triggers for recording evaluations. For details on triggers, please visit the introduction of IOHprofiler_observer.

```cpp
std::vector<int> time_points{1,2,5};
std::shared_ptr<IOHprofiler_csv_logger> logger(new IOHprofiler_csv_logger("./","run_suite","EA","EA"));
logger->set_complete_flag(true);
logger->set_interval(2);
logger->set_time_points(time_points,3);
logger->activate_logger();
logger->target_suite(pbo.IOHprofiler_suite_get_suite_name());
```


## Conducting experiments with a configuration file

By using the provided IOHprofiler_experiment class, you can use a configuration file to configure both the suite and the logger for csv files.

To use the provided experiment structure, you need to provide both the path to the configuration file and the pointer to your optimization algorithm to the experimenter._run() function, which will execute all tasks of the experiment.

In addition, you can set the number of repetitions for all problems by using experimenter._set_independent_runs(2).

```cpp
std::string configName = "./configuration.ini";
IOHprofiler_experimenter<int> experimenter(configName,evolutionary_algorithm);
experimenter._set_independent_runs(2);
experimenter._run();
```


The configuration file consists of three sections:

suite configures the problems to be tested.

• suite_name is the name of the suite to be used. Please make sure that a suite with the corresponding name is registered.
• problem_id configures the problems to be tested. Note that the ids of problems are assigned by the suite; please make sure each id is within the valid range.
• instance_id configures the transformation methods applied to the problems. For PBO:
  • 1 : no transformation is applied to the problem.
  • 2-50 : XOR and SHIFT operations are applied to the problem.
  • 51-100 : SIGMA and SHIFT operations are applied to the problem.
• dimension configures the dimensions of the problems to be tested. Note that the allowed dimension is not larger than 20000.

logger configures the setting of output csv files.

• result_folder is the directory of the folder where output files are stored.
• algorithm_name is the name of the algorithm, which is used when generating ‘.info’ files.
• algorithm_info is additional information about the algorithm, which is used when generating ‘.info’ files.

observer configures the parameters of IOHprofiler_observer, which is used in IOHprofiler_csv_logger:

• complete_triggers is the switch for .cdat files, which work with the complete tracking strategy. Set it to TRUE or true if you want to output .cdat files.
• update_triggers is the switch for .dat files, which work with the target-based tracking strategy. Set it to TRUE or true if you want to output .dat files.
• number_interval_triggers configures the .idat files, which work with the interval tracking strategy; number_interval_triggers sets the value of the frequency. If you do not want to generate .idat files, set number_interval_triggers to 0.
• number_target_triggers configures the .tdat files, which work with the time-based tracking strategy.
• base_evaluation_triggers configures the .tdat files, which work with the time-based tracking strategy. To switch off .tdat files, set both number_target_triggers and base_evaluation_triggers to 0.
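Putting the three sections together, a configuration file for the PBO example above might look as follows. The section and key names follow the description above, and the values are illustrative; please check them against the configuration.ini shipped with the project.

```ini
[suite]
suite_name = PBO
problem_id = 1,2
instance_id = 1,2
dimension = 100,200,300

[logger]
result_folder = ./results
algorithm_name = EA
algorithm_info = "An evolutionary algorithm with mutation operator"

[observer]
complete_triggers = false
update_triggers = true
number_interval_triggers = 0
number_target_triggers = 3
base_evaluation_triggers = 1,2,5
```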

## Useful functions

IOHprofiler_problem and IOHprofiler_suite provide public member functions so that the optimizer can acquire useful information during the optimization process.

A list of useful member functions of IOHprofiler_problem is below:

• evaluate(x), returns a fitness value. The argument x is a vector of variables.
• evaluate(x,y), updates y with the fitness values, where x is a vector of variables.
• reset_problem(), resets the evaluation history of the problem. You should call this function first when you plan to run another test on the same problem class.
• IOHprofiler_hit_optimal(), returns true if the optimum of the problem has been found.
• IOHprofiler_get_number_of_variables(), returns the dimension of the problem.
• IOHprofiler_get_evaluations(), returns the number of function evaluations that have been used.
• loggerInfo(), returns a vector of information about function evaluations, consisting of the number of evaluations, the current raw objective, the best-so-far raw objective, the current transformed objective, and the best-so-far transformed objective.
• loggerCOCOInfo(), returns a vector of information about function evaluations, consisting of the number of evaluations, the precision of the current objective, the best-so-far precision, the current objective, and the best-so-far objective.
• IOHprofiler_get_problem_id()
• IOHprofiler_get_instance_id()
• IOHprofiler_get_problem_name()
• IOHprofiler_get_problem_type()
• IOHprofiler_get_lowerbound()
• IOHprofiler_get_upperbound()
• IOHprofiler_get_number_of_objectives()
• IOHprofiler_get_optimization_type()

A list of useful member functions of IOHprofiler_suite is below:

• get_next_problem(), returns a shared pointer to the problems of the suite, in order.
• get_current_problem(), returns the current problem and resets it.
• get_problem(problem_name,instance,dimension), returns the specified problem.
• IOHprofiler_suite_get_number_of_problems()
• IOHprofiler_suite_get_number_of_instances()
• IOHprofiler_suite_get_number_of_dimensions()
• IOHprofiler_suite_get_problem_id()
• IOHprofiler_suite_get_instance_id()
• IOHprofiler_suite_get_dimension()
• IOHprofiler_suite_get_suite_name()