Sequential Learnable Evolutionary Algorithm: A Research Program

Evolutionary algorithms are typically run several times on a design optimization problem and the best solution is taken. We propose a novel online algorithm selection framework that learns to choose the best algorithm based on previous runs, in effect using different, and better, algorithms as the search progresses. First, a set of algorithms is run on a benchmark problem suite and the results are stored in a problem database. Given a new problem, a default algorithm is run and its convergence characteristics are recorded; these are matched against the database to find the most similar benchmark problem. The database then returns the best algorithm for that problem, this algorithm is run in the second iteration, and so on, so that the search homes in on the most suitable algorithm for the problem at hand. The resulting algorithm, named the Sequential Learnable Evolutionary Algorithm (SLEA), outperforms the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) with multi-restarts. SLEA is also applied to a new problem, a real-world application, and learns its characteristics; experimental results show that it correctly selects the best algorithm for the problem. Finally, this paper proposes a new research program that learns the algorithm-problem mapping by solving real-world problems accessed through the web and through worldwide cooperation via Wikipedia.
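The selection loop described above can be sketched in Python. This is a minimal illustration under assumed details, not the paper's implementation: the portfolio here is two toy (1+1)-EA variants, the "convergence characteristics" are simplified to a four-point best-fitness trace, and the problem database, its signature format, and the squared-distance matching rule are all hypothetical stand-ins for whatever the authors actually use.

```python
import random

def run_ea(problem, mutate_scale, budget=200):
    """A minimal (1+1)-EA stand-in for a portfolio member: hill-climbs with
    Gaussian mutation and records a convergence signature (the best fitness
    after each quarter of the evaluation budget)."""
    x = [random.uniform(-5.0, 5.0) for _ in range(2)]
    best = problem(x)
    signature = []
    for step in range(1, budget + 1):
        cand = [xi + random.gauss(0.0, mutate_scale) for xi in x]
        f = problem(cand)
        if f < best:
            x, best = cand, f
        if step % (budget // 4) == 0:
            signature.append(best)
    return x, best, signature

def distance(sig_a, sig_b):
    """Squared Euclidean distance between two convergence signatures."""
    return sum((a - b) ** 2 for a, b in zip(sig_a, sig_b))

def slea(problem, algorithms, database, default="coarse", iterations=3):
    """Sequential selection loop: run the current algorithm, match the recorded
    convergence signature to the most similar stored problem, then switch to
    the best-known algorithm for that problem before the next iteration."""
    current = default
    best_x, best_f = None, float("inf")
    for _ in range(iterations):
        x, f, sig = algorithms[current](problem)
        if f < best_f:
            best_x, best_f = x, f
        nearest = min(database, key=lambda entry: distance(entry["signature"], sig))
        current = nearest["best_algorithm"]
    return best_x, best_f, current

# Toy portfolio (coarse vs. fine mutation) and a two-entry problem database.
algorithms = {
    "coarse": lambda p: run_ea(p, mutate_scale=1.0),
    "fine": lambda p: run_ea(p, mutate_scale=0.05),
}
database = [
    {"signature": [5.0, 2.0, 1.0, 0.5], "best_algorithm": "fine"},
    {"signature": [30.0, 20.0, 15.0, 12.0], "best_algorithm": "coarse"},
]

random.seed(1)
sphere = lambda v: sum(vi * vi for vi in v)
best_x, best_f, chosen = slea(sphere, algorithms, database)
```

The key design point the sketch preserves is that the algorithm chosen for the next run is decided online, from the convergence behaviour observed on the current problem, rather than fixed in advance.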
