A Recommender System for Metaheuristic Algorithms for Continuous Optimization Based on Deep Recurrent Neural Networks

As revealed by the no free lunch theorem, no single algorithm can outperform all others on every class of optimization problems. To tackle this issue, methods for recommending an existing algorithm to solve a given problem have been proposed. However, existing recommendation methods for continuous optimization suffer from low practicability and transferability, mainly due to the difficulty of extracting features that effectively describe the problem structure and the lack of data for training a recommendation model. This work proposes a generic recommender system to address these two challenges. First, a novel method is proposed to represent the analytic objective function of a continuous optimization problem as a tree, which is used directly as the features of the problem. For black-box optimization problems whose objective function is unknown, a symbolic regressor is adopted to estimate the tree structure. Second, a large number of benchmark problems are randomly created based on the proposed tree representation, providing abundant training data with various levels of difficulty. By employing a deep recurrent neural network, a recommendation model is trained to recommend the most suitable metaheuristic algorithm for white- or black-box optimization, a significant step towards fully automated algorithm recommendation for continuous optimization. Experimental results on 100,000 benchmark problems show that the proposed recommendation model achieves considerably better performance than existing ones and exhibits high transferability to real-world problems.
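To make the tree representation concrete, the following is a minimal sketch of how an analytic objective function could be parsed into an expression tree and flattened into a token sequence for a recurrent model. This uses Python's `ast` module purely for illustration; the helper names (`expr_to_tree`, `preorder`) are hypothetical and do not reflect the paper's actual encoding.

```python
import ast

def expr_to_tree(expr: str):
    """Parse an analytic objective expression into a nested tuple tree."""
    def walk(node):
        if isinstance(node, ast.BinOp):
            # Operator name becomes the internal node label, e.g. 'Add', 'Pow'
            return (type(node.op).__name__, walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp):
            return (type(node.op).__name__, walk(node.operand))
        if isinstance(node, ast.Call):
            # Elementary functions such as sin, cos, exp become internal nodes
            return (node.func.id, *[walk(a) for a in node.args])
        if isinstance(node, ast.Name):
            return node.id          # decision variable, e.g. 'x1'
        if isinstance(node, ast.Constant):
            return node.value       # numeric constant
        raise ValueError(f"unsupported node: {ast.dump(node)}")
    return walk(ast.parse(expr, mode="eval").body)

def preorder(tree):
    """Flatten a tree into a pre-order token sequence for a recurrent model."""
    if isinstance(tree, tuple):
        return [tree[0]] + [tok for child in tree[1:] for tok in preorder(child)]
    return [str(tree)]

tree = expr_to_tree("sin(x1) + x2**2")
tokens = preorder(tree)  # e.g. ['Add', 'sin', 'x1', 'Pow', 'x2', '2']
```

A sequence of tokens like this, one per tree node, is the kind of input a recurrent network (e.g. an LSTM) can consume, which is the intuition behind using the tree directly as problem features.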
