Necessary and Sufficient Conditions for Surrogate Functions of Pareto Frontiers and Their Synthesis Using Gaussian Processes

This paper introduces necessary and sufficient conditions that surrogate functions must satisfy to properly define frontiers of nondominated solutions in multiobjective optimization (MOO) problems. These conditions act directly on the objective space and are therefore agnostic to how the solutions are evaluated: the surrogate may be fit to real objective values or to user-designed surrogates of the individual objectives, opening the possibility of linking independent objective surrogates. To illustrate the practical consequences of adopting the proposed conditions, we use Gaussian processes (GPs) as surrogates, endowed with soft monotonicity constraints and an adjustable degree of flexibility, and compare them to unconstrained GPs and to the frontier-surrogate method in the literature closest to the one proposed here. Results show that the constrained GP finely manages the proposed necessary and sufficient conditions, yielding high-quality surrogates that suitably synthesize an approximation to the Pareto frontier in challenging MOO instances, whereas an existing approach that does not take the proposed theory into account produces surrogates that grossly violate the conditions for describing a valid frontier.
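The core geometric requirement behind such conditions can be illustrated for the two-objective minimization case: a set of points is a valid frontier only if it contains no mutually dominating pairs, which for two objectives is equivalent to the frontier being the graph of a decreasing function in objective space. A minimal sketch of this dominance check (function names are illustrative, not taken from the paper):

```python
import numpy as np

def dominates(a, b):
    """Return True if point a Pareto-dominates point b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return bool(np.all(a <= b) and np.any(a < b))

def is_valid_frontier(points):
    """Check the necessary condition that no point of a candidate
    frontier dominates another point of the same set."""
    pts = [np.asarray(p, dtype=float) for p in points]
    return not any(
        dominates(p, q)
        for i, p in enumerate(pts)
        for j, q in enumerate(pts)
        if i != j
    )

# A decreasing curve in objective space satisfies the condition...
valid = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
# ...while a non-monotone set contains a dominated point:
# (0.5, 0.8) dominates (0.6, 0.9).
invalid = [(0.0, 1.0), (0.5, 0.8), (0.6, 0.9)]
```

A surrogate of the frontier must respect this condition everywhere on its domain, which is what motivates imposing monotonicity constraints on the GP rather than checking a finite point set after the fact.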
