When resampling to cope with noise, use median, not mean

Due to their randomized nature, many nature-inspired heuristics are robust to some level of noise in the fitness evaluations. A common strategy to increase the tolerance to noise is to re-evaluate the fitness of a solution candidate several times and then to work with the average of the sampled fitness values. In this work, we propose to use the median instead of the mean. Besides being invariant to rescalings of the fitness, the median turns out to be much more robust than the mean in many situations. We show that when the noisy fitness is ϵ-concentrated, a logarithmic number of samples suffices to recover the undisturbed fitness (via the median of the samples) with high probability. This gives a simple metaheuristic approach to transform a randomized optimization heuristic into one that is robust to this type of noise, at a runtime overhead of only a logarithmic factor. We further show that ϵ-concentrated noise occurs frequently in standard situations. We also provide lower bounds showing that in two such situations, even with larger numbers of samples, the average-resample strategy cannot optimize the problem in polynomial time.
