Mirrored variants of the (1,2)-CMA-ES compared on the noisy BBOB-2010 testbed

Derandomization by means of mirrored samples has recently been introduced to enhance the performance of (1,λ)- and (1+2)-Evolution Strategies (ESs), with the aim of designing fast stochastic local search algorithms. In this paper, we investigate the impact of mirrored samples on noisy optimization. Since elitist selection is detrimental in noisy optimization, we investigate only non-elitist ESs here. On the BBOB-2010 noisy benchmark testbed, we compare two variants of the (1,2)-CMA-ES that implement mirrored samples against the baseline (1,2)-CMA-ES. Each algorithm implements a restart mechanism. A total budget of 10^4 D function evaluations per trial was used, where D is the dimension of the search space. The experiments clearly show a ranking among the three algorithms: both mirrored variants have expected running times that are lower than those of the (1,2)-CMA-ES by at least 50% on five functions, and they solve three additional functions in 20D that the (1,2)-CMA-ES cannot solve (or solves only with small probability). The comparison between the two mirrored variants favors the algorithm that additionally employs sequential selection: it outperforms the algorithm with only mirrored samples on five functions by at least 17%, while no statistically significant worsening is observed. Both algorithms using mirrored samples also outperform the function-wise best algorithm of the BBOB-2009 benchmarking on three (respectively four) functions with Cauchy noise by up to 65%.
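To illustrate the two techniques named in the abstract, the following is a minimal sketch of mirrored sampling with sequential selection in a non-elitist (1,2)-ES. It is not the authors' (1,2)-CMA-ES: it omits covariance matrix and step-size adaptation, uses an isotropic Gaussian with a fixed sigma, and the test function `sphere` and all parameter values are assumptions made for demonstration only.

```python
import numpy as np

def sphere(x):
    """Simple test function (assumed here purely for demonstration)."""
    return float(np.dot(x, x))

def one_comma_two_mirrored_sequential(f, x0, sigma=0.3, max_evals=2000, seed=42):
    """(1,2)-ES sketch with mirrored sampling and sequential selection.

    Mirrored sampling: the two offspring share one Gaussian vector z and
    are placed at parent + sigma*z and parent - sigma*z.
    Sequential selection: if the first offspring already improves on the
    parent, it is accepted immediately and the mirrored offspring is not
    evaluated; otherwise the better of the two offspring replaces the
    parent (comma, i.e. non-elitist, selection).
    """
    rng = np.random.default_rng(seed)
    parent = np.asarray(x0, dtype=float)
    f_parent = f(parent)
    evals = 1
    while evals < max_evals:
        z = rng.standard_normal(parent.shape)    # one random direction
        x_plus = parent + sigma * z              # first offspring
        f_plus = f(x_plus)
        evals += 1
        if f_plus < f_parent:
            # sequential selection: accept immediately, skip the mirror
            parent, f_parent = x_plus, f_plus
            continue
        x_minus = parent - sigma * z             # mirrored offspring
        f_minus = f(x_minus)
        evals += 1
        # comma selection: the better offspring always replaces the parent
        if f_minus < f_plus:
            parent, f_parent = x_minus, f_minus
        else:
            parent, f_parent = x_plus, f_plus
    return parent, f_parent

best_x, best_f = one_comma_two_mirrored_sequential(sphere, x0=np.ones(5))
print(best_f)
```

The variant with only mirrored samples would simply always evaluate both offspring before selecting; sequential selection saves the second evaluation whenever the first offspring already improves on the parent.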