Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm

Many engineering problems require the optimization of expensive black-box functions with multiple conflicting criteria, for which commonly used methods such as multiobjective genetic algorithms are inadequate. Several surrogate-based algorithms have been developed to tackle this problem, but these often have drawbacks such as requiring a priori knowledge of the output functions or computational cost that scales exponentially with the number of objectives. In this paper a new algorithm, TSEMO, is proposed that uses Gaussian processes as surrogates. The Gaussian processes are sampled via spectral sampling techniques, enabling Thompson sampling to be used in conjunction with the hypervolume quality indicator and NSGA-II to choose a new evaluation point at each iteration. The reference point required for the hypervolume calculation is estimated within TSEMO. Furthermore, a simple extension is proposed to carry out batch-sequential design. TSEMO is compared to ParEGO, an expected hypervolume implementation, and NSGA-II on nine test problems with a budget of 150 function evaluations. Overall, TSEMO shows promising performance while remaining a simple algorithm: it requires no a priori knowledge, reduces the hypervolume calculations to approximately linear scaling with the number of objectives, can handle noise, and supports batch-sequential usage.
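To make the selection step concrete, the minimal sketch below illustrates the two ingredients named above: spectral sampling of a Gaussian process posterior via random Fourier features, which yields a Thompson sample as a cheap deterministic function, and a non-dominated selection step over those samples. It assumes squared-exponential kernels with fixed, known hyperparameters, and it substitutes random candidate search plus a random pick from the sampled Pareto front for the NSGA-II run and hypervolume-improvement choice used in TSEMO itself; the names `sample_gp_posterior` and `tsemo_step` are illustrative, not from the paper.

```python
import numpy as np

def sample_gp_posterior(X, y, lengthscale, sigma_f, sigma_n, m=200, rng=None):
    """Draw one approximate posterior sample of a squared-exponential GP via
    random Fourier features (spectral sampling, by Bochner's theorem). The
    sample is returned as a deterministic function that is cheap to evaluate."""
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    W = rng.normal(size=(m, d)) / lengthscale            # spectral frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)            # random phases

    def phi(Z):                                          # feature map, shape (n, m)
        return np.sqrt(2.0 * sigma_f**2 / m) * np.cos(Z @ W.T + b)

    # Bayesian linear regression in feature space: theta ~ N(mean, sigma_n^2 A^-1)
    P = phi(X)
    A = P.T @ P + sigma_n**2 * np.eye(m)
    L = np.linalg.cholesky(A)
    mean = np.linalg.solve(L.T, np.linalg.solve(L, P.T @ y))
    theta = mean + sigma_n * np.linalg.solve(L.T, rng.standard_normal(m))
    return lambda Z: phi(Z) @ theta                      # Thompson sample of f

def tsemo_step(X, Y, lb, ub, hypers, n_cand=2000, rng=None):
    """One selection step: draw one Thompson sample per objective, find the
    non-dominated candidates under the sampled objectives, and pick one.
    TSEMO proper runs NSGA-II on the samples and selects the point with the
    largest hypervolume improvement; random search and a random pick from
    the sampled front stand in for both here."""
    rng = np.random.default_rng() if rng is None else rng
    fs = [sample_gp_posterior(X, Y[:, j], *hypers[j], rng=rng)
          for j in range(Y.shape[1])]
    C = rng.uniform(lb, ub, size=(n_cand, X.shape[1]))   # candidate inputs
    F = np.column_stack([f(C) for f in fs])              # sampled objective values
    dominated = np.array([np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1))
                          for f in F])                   # Pareto dominance (minimization)
    front = C[~dominated]
    return front[rng.integers(len(front))]

# Toy usage on two conflicting 1-D objectives with guessed hyperparameters.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(10, 1))
Y = np.column_stack([np.sin(6 * X[:, 0]), np.cos(6 * X[:, 0])])
hypers = [(0.2, 1.0, 0.05), (0.2, 1.0, 0.05)]            # (lengthscale, sigma_f, sigma_n)
print("next evaluation point:", tsemo_step(X, Y, 0.0, 1.0, hypers, rng=rng))
```

The design point this illustrates is why spectral sampling matters: the drawn sample is a fixed finite-dimensional function rather than a correlated GP draw, so a genetic algorithm can evaluate it thousands of times at negligible cost, and only the single point finally selected is sent to the expensive black-box objectives.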
