Dynamic Resampling for Preference-based Evolutionary Multi-Objective Optimization of Stochastic Systems

In Multi-objective Optimization, many solutions have to be evaluated in order to provide the decision maker with a diverse choice of solutions along the Pareto-front. In Simulation-based Optimization, the number of optimization function evaluations is usually very limited due to the long execution times of the simulation models. If preference information is available, however, the available function evaluations can be used more effectively. The optimization can be performed as a guided, focused search which returns solutions close to interesting, preferred regions of the Pareto-front. One such algorithm for guided search is the Reference-point guided Non-dominated Sorting Genetic Algorithm II, R-NSGA-II. It is a population-based Evolutionary Algorithm that finds a set of non-dominated solutions in a single optimization run. R-NSGA-II takes reference points in the objective space provided by the decision maker and guides the optimization towards areas of the Pareto-front close to the reference points.

In Simulation-based Optimization, the modeled and simulated systems are often stochastic, and a common method to handle objective noise is Resampling. Reliable quality assessment of system configurations through resampling requires many simulation runs. Therefore, the optimization process can benefit from Dynamic Resampling algorithms that distribute the available function evaluations among the solutions in the best possible way. Solutions can vary in their sampling need: for example, solutions with highly variable objective values have to be sampled more often to reduce the standard error of their objective values. Dynamic Resampling algorithms assign as many samples to them as are needed to reduce the uncertainty about their objective values below a certain threshold. Another criterion the number of samples can be based on is a solution's closeness to the Pareto-front. Solutions that are close to the Pareto-front are likely to be members of the final result set.
It is therefore important to have accurate knowledge of their objective values available, in order to tell which solutions are better than others. Usually, the distance to the Pareto-front is not known, but another criterion can serve as an indication for it: the elapsed optimization time. A third example of a resampling criterion is the dominance relation between different solutions. The optimization algorithm has to determine, for pairs of solutions, which is the better one. Here, both the distances between objective vectors and the variance of the objective values have to be considered, which requires a more advanced resampling technique; this is a Ranking and Selection problem.

If R-NSGA-II is applied in a scenario with a stochastic fitness function, resampling algorithms have to be used to support it in the best way and to avoid performance degradation due to uncertain knowledge about the objective values of solutions. In our work, we combine R-NSGA-II with several resampling algorithms that are based on the above-mentioned resampling criteria, or combinations thereof, and evaluate which criteria the sampling allocation can best be based on, and in which situations.

Due to the preference information, R-NSGA-II has an important piece of fitness information about the solutions at its disposal: the distance to the reference points. We propose a resampling strategy that allocates more samples to solutions close to a reference point. This idea is then extended with a resampling technique that compares solutions based on their distance to the reference point. We base this algorithm on a classical Ranking and Selection algorithm, Optimal Computing Budget Allocation (OCBA), and show how OCBA can be applied to support R-NSGA-II. We demonstrate the applicability of the proposed algorithms in a case study of an industrial production line for car manufacturing.
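The standard-error criterion described above can be illustrated as a simple sequential sampling loop: a solution is re-evaluated until the standard error of its mean objective value falls below a threshold or a per-solution budget is exhausted. This is a minimal sketch; the function and parameter names (`resample_until_confident`, `threshold`, `min_samples`, `max_samples`) are illustrative assumptions, not taken from the paper:

```python
import math

def resample_until_confident(evaluate, threshold, min_samples=2, max_samples=20):
    """Sequentially sample a noisy objective function until the standard
    error of the mean falls below `threshold`, or until `max_samples`
    evaluations have been spent. Returns the mean estimate and the
    number of samples used. Names are illustrative, not from the paper."""
    samples = [evaluate() for _ in range(min_samples)]
    while len(samples) < max_samples:
        n = len(samples)
        mean = sum(samples) / n
        # Unbiased sample variance; standard error of the mean is sqrt(var / n).
        var = sum((x - mean) ** 2 for x in samples) / (n - 1)
        if math.sqrt(var / n) <= threshold:
            break
        samples.append(evaluate())
    return sum(samples) / len(samples), len(samples)
```

Solutions with low objective noise stop early at `min_samples`, while highly variable solutions consume samples up to the cap, which is exactly the uneven allocation a Dynamic Resampling scheme aims for.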
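The proposed distance-based strategy, allocating more samples to solutions near a reference point, can likewise be sketched in a few lines. The linear interpolation between a minimum and maximum budget shown here, and all names in it, are assumptions chosen for illustration rather than the paper's exact allocation rule:

```python
import math

def distance_based_budget(objectives, reference_point, b_min=1, b_max=10):
    """Assign a sampling budget to each solution based on the Euclidean
    distance of its objective vector to a decision-maker reference point:
    the closest solution receives `b_max` samples, the farthest `b_min`,
    with linear interpolation in between. Illustrative sketch only."""
    dists = [math.dist(obj, reference_point) for obj in objectives]
    d_lo, d_hi = min(dists), max(dists)
    span = (d_hi - d_lo) or 1.0  # avoid division by zero if all distances equal
    return [round(b_max - (d - d_lo) / span * (b_max - b_min)) for d in dists]
```

In an R-NSGA-II setting, such a budget vector would be recomputed each generation so that the sampling effort follows the population as it converges towards the preferred region.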
