Adaptive Sampling of Pareto Frontiers with Binary Constraints Using Regression and Classification

We present a novel adaptive optimization algorithm for black-box multi-objective optimization problems with binary constraints, built on the foundations of Bayesian optimization. Our method is based on probabilistic regression and classification models, which act as surrogates for the optimization goals and allow us to suggest multiple design points at once in each iteration. The proposed acquisition function is intuitively understandable and can be tuned to the demands of the problem at hand. We also present a novel ellipsoid truncation method that speeds up the expected hypervolume calculation in a straightforward way for regression models with a normal probability density. We benchmark our approach against an evolutionary algorithm on multiple test problems.
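
To make the general pattern concrete, the following is a minimal Python sketch, not the authors' implementation: Gaussian-process regression surrogates for the objectives, a Gaussian-process classifier for the binary constraint, and a feasibility-weighted, plug-in hypervolume-improvement score over a candidate pool. The two-objective minimization setting, the scikit-learn models, and all function and variable names here are illustrative assumptions; the paper's actual acquisition function and its ellipsoid truncation for the expected hypervolume are not reproduced.

import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier, GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def hypervolume_2d(front, ref):
    """Hypervolume dominated by a two-objective front (minimization) w.r.t. ref."""
    pts = front[np.all(front <= ref, axis=1)]
    if len(pts) == 0:
        return 0.0
    pts = pts[np.argsort(pts[:, 0])]
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv


def propose_next(X, Y, feasible, candidates, ref):
    """Score candidate design points and return the most promising one.

    X          -- evaluated design points, shape (n, d)
    Y          -- observed objective values (to be minimized), shape (n, 2)
    feasible   -- binary constraint outcomes (0/1, both classes present), shape (n,)
    candidates -- candidate design points to score, shape (k, d)
    ref        -- reference point for the hypervolume indicator, shape (2,)
    """
    # Regression surrogates, one Gaussian process per objective.
    regressors = [
        GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, Y[:, j])
        for j in range(Y.shape[1])
    ]
    # Classification surrogate for the binary constraint.
    clf = GaussianProcessClassifier(kernel=Matern(nu=2.5)).fit(X, feasible)

    front = Y[feasible.astype(bool)]
    hv_now = hypervolume_2d(front, ref)

    scores = []
    for x in candidates:
        x = x.reshape(1, -1)
        mu = np.array([gp.predict(x)[0] for gp in regressors])
        # Predicted probability that the binary constraint is satisfied (class label 1).
        p_feasible = clf.predict_proba(x)[0, list(clf.classes_).index(1)]
        # Plug-in hypervolume improvement of the predicted mean, weighted by feasibility.
        hvi = hypervolume_2d(np.vstack([front, mu]), ref) - hv_now
        scores.append(hvi * p_feasible)
    return candidates[int(np.argmax(scores))]

In this sketch the plug-in hypervolume improvement of the predicted mean stands in for the full expected hypervolume improvement, whose exact computation under a normal predictive density is what the ellipsoid truncation in the paper is designed to accelerate.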
