Clustering and Ranking Based Methods for Selecting Tuned Search Heuristic Parameters

This paper presents two alternative methods for selecting tuned hyper-parameters of search heuristics for computationally expensive simulation-based optimization problems. The set of available hyper-parameters is obtained by applying a meta-optimization to a number of mathematical test functions at different problem dimensions. Since the meta-optimization yields a large number of tuned hyper-parameter configurations, we develop methods to select suitable configurations for our actual simulation-based problems. The first selection method clusters the tuned hyper-parameters and chooses the medoids of the largest clusters. The second method constructs a ranking matrix that cross-checks the expected cost value of each tuned hyper-parameter configuration on the other training functions in the set; the configuration with the minimal sum of ranks in the matrix is selected. We apply our methods to custom particle swarm optimization and evolutionary algorithms. The feasibility of our approach is demonstrated by benchmarking the tuned search heuristics against a number of other search heuristics from open-source libraries on continuous building energy simulation problems. Our results show that our tuning and selection methods identify hyper-parameters whose performance is competitive with popular or recent algorithms such as CMA-ES or RBFOpt. Future work should apply the developed methods to more recent search heuristics and use synthetic training functions in the meta-optimization that better resemble building simulation problems, instead of the mathematical test functions currently employed.
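The first selection method (medoids of the largest clusters) can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes hyper-parameter configurations are given as numeric vectors and uses a basic PAM-style k-medoids written from scratch, with the cluster count `k` chosen externally (the paper uses silhouette analysis for that choice):

```python
import numpy as np


def k_medoids(X, k, n_iter=50, seed=0):
    """Tiny PAM-style k-medoids: assign each point to its nearest medoid,
    then move each medoid to the member minimizing intra-cluster distance."""
    rng = np.random.default_rng(seed)
    medoid_idx = rng.choice(len(X), size=k, replace=False)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # distances from every point to every current medoid
        d = np.linalg.norm(X[:, None, :] - X[medoid_idx][None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_idx = medoid_idx.copy()
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size == 0:
                continue
            # candidate medoid = member with smallest summed distance to the cluster
            intra = np.linalg.norm(
                X[members][:, None, :] - X[members][None, :, :], axis=2
            ).sum(axis=1)
            new_idx[c] = members[intra.argmin()]
        if np.array_equal(new_idx, medoid_idx):
            break
        medoid_idx = new_idx
    return medoid_idx, labels


def select_by_largest_cluster(configs, k=2):
    """Cluster tuned configurations and return the medoid of the largest cluster."""
    medoid_idx, labels = k_medoids(configs, k)
    sizes = np.bincount(labels, minlength=k)
    return configs[medoid_idx[sizes.argmax()]]
```

The idea is that the medoid of a large cluster is an actual tuned configuration (not an average) that is representative of many independent tuning runs.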

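The second selection method can likewise be sketched. Assume a cost matrix whose entry (i, j) holds the expected best cost that configuration i achieves on training function j; configurations are ranked per function and the configuration with the smallest rank sum wins (a Friedman-style aggregation). The matrix layout and tie handling here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np


def select_by_rank_sum(cost):
    """cost[i, j]: expected best cost of configuration i on training function j.
    Rank configurations within each function (column), rank 1 = lowest cost,
    then return the index of the configuration with the minimal sum of ranks."""
    # double argsort turns column values into 0-based ranks; +1 makes rank 1 best
    ranks = cost.argsort(axis=0).argsort(axis=0) + 1
    return int(ranks.sum(axis=1).argmin())
```

Because ranks are compared instead of raw cost values, training functions with very different cost scales contribute equally to the selection.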