Covariance matrix adaptation strategy for a multi-objective evolutionary algorithm based on reference points

In this article, an effective method called an adaptive covariance matrix strategy based on reference points (RPCMA-ES) is proposed for multi-objective optimization. In the proposed algorithm, the search space is divided into independent sub-regions by calculating the angle between each objective vector and the reference vectors. The reference vectors not only decompose the original multi-objective optimization problem into a number of single-objective subproblems, but also allow user preferences to be expressed so that the search can target a preferred subset of the whole Pareto front (PF). In this respect, any single-objective optimizer can easily be used within this framework. Inspired by multi-objective estimation of distribution algorithms, the covariance matrix adaptation evolution strategy (CMA-ES) is incorporated into RPCMA-ES. CMA-ES is a state-of-the-art optimizer for single-objective continuous functions and has proven able to strike a good balance between exploration and exploitation of the search space. Furthermore, in order to avoid becoming trapped in local optima and to move the new mean closer to the optimal solution, a chaos operator is added to CMA-ES. Simulation results comparing the proposed algorithm with four state-of-the-art multi-objective optimization algorithms show that it is competitive and effective in terms of convergence and distribution.
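As an illustration of the two mechanisms summarized above, the following Python sketch shows (i) how solutions can be partitioned into sub-regions by the angle between their objective vectors and a set of reference vectors, and (ii) one common logistic-map realization of a chaos operator applied to a CMA-ES mean. The function names, the choice of logistic map, and all parameters are assumptions for illustration, not the authors' exact implementation.

```python
# Minimal sketch of angle-based sub-region assignment and a logistic-map
# chaos perturbation, assuming normalized objectives and unit-length reference
# vectors; all names and parameters here are illustrative assumptions.
import numpy as np

def assign_to_reference_vectors(objectives, ref_vectors):
    """Assign each solution to the reference vector forming the smallest angle
    with its (ideal-point-translated) objective vector, yielding independent
    sub-regions of the objective space."""
    # Translate by the ideal point so all objective vectors are non-negative.
    translated = objectives - objectives.min(axis=0)
    # Cosine of the angle between every objective vector and every reference vector.
    norms = np.linalg.norm(translated, axis=1, keepdims=True) + 1e-12
    ref_norms = np.linalg.norm(ref_vectors, axis=1, keepdims=True) + 1e-12
    cosines = (translated / norms) @ (ref_vectors / ref_norms).T
    # Smallest angle corresponds to the largest cosine.
    return np.argmax(cosines, axis=1)

def chaotic_mean_shift(mean, lower, upper, iterations=3):
    """Perturb a CMA-ES mean with a logistic-map chaos sequence, one common way
    to realize the chaos operator mentioned in the abstract (an assumption here)."""
    # Map the mean into [0, 1], iterate the logistic map, and map back.
    z = (mean - lower) / (upper - lower)
    for _ in range(iterations):
        z = np.clip(z, 1e-6, 1.0 - 1e-6)   # keep away from the fixed points 0 and 1
        z = 4.0 * z * (1.0 - z)             # logistic map with control parameter 4
    return lower + z * (upper - lower)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    objs = rng.random((8, 2))                # 8 solutions, 2 objectives
    refs = np.array([[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]])
    print(assign_to_reference_vectors(objs, refs))
    print(chaotic_mean_shift(rng.random(3), np.zeros(3), np.ones(3)))
```

In this sketch, each sub-region can then be searched by an independent single-objective optimizer such as CMA-ES, with the chaotic shift applied to the distribution mean as an extra safeguard against premature convergence.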
