Uncrowded Hypervolume-Based Multiobjective Optimization with Gene-Pool Optimal Mixing

Domination-based multi-objective (MO) evolutionary algorithms (EAs) are today arguably the most frequently used type of MOEA. These methods, however, stagnate when the majority of the population becomes non-dominated, preventing convergence to the Pareto set. Hypervolume-based MO optimization has shown promise in overcoming this. Direct use of the hypervolume, however, provides no selection pressure for dominated solutions. The recently introduced Sofomore framework overcomes this by solving multiple interleaved single-objective dynamic problems that iteratively improve a single approximation set, based on the uncrowded hypervolume improvement (UHVI). This, however, sacrifices many advantages of population-based MO optimization, such as the ability to handle multimodality. Here, we reformulate the UHVI as a quality measure for approximation sets, called the uncrowded hypervolume (UHV), which can be used to directly solve MO optimization problems with a single-objective optimizer. We use the state-of-the-art gene-pool optimal mixing evolutionary algorithm (GOMEA), which can efficiently exploit the intrinsically available grey-box properties of this problem. The resulting algorithm, UHV-GOMEA, is compared to Sofomore equipped with GOMEA and to the domination-based MO-GOMEA. In doing so, we investigate in which scenarios domination-based or hypervolume-based methods are preferable. Finally, we construct a simple hybrid approach that combines MO-GOMEA with UHV-GOMEA and outperforms both.
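For concreteness, the display below sketches one way the UHV quality measure described above can be written. The notation (reference point, number of objectives m, uncrowded distance ud) is an assumption modeled on the UHVI-based Sofomore framework, not a verbatim quote of the paper's definition.

\[
\mathrm{UHV}(\mathcal{S}) \;=\; \mathrm{HV}(\mathcal{S}) \;-\; \frac{1}{|\mathcal{S}|} \sum_{\mathbf{x} \in \mathcal{S}} \mathrm{ud}\big(f(\mathbf{x}), \mathcal{S}\big)^{m}
\]

Here HV(S) is the hypervolume dominated by the non-dominated members of the approximation set S with respect to a chosen reference point, and ud(f(x), S) is the Euclidean distance of f(x) to the boundary of the region dominated by S (zero for non-dominated solutions). Maximizing UHV(S) over the concatenated decision variables of all solutions in S is then a single-objective problem; changing one solution leaves the objective values of the others unchanged, which is presumably the grey-box structure the abstract refers to.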
