ExperienceThinking: Hyperparameter Optimization with Budget Constraints

Hyperparameter optimization arises widely in practice, and many common tasks, such as neural architecture search and feature subset selection, can be cast as instances of it. In the absence of resource constraints, existing hyperparameter tuning techniques can solve these problems effectively by evaluating as many hyperparameter configurations as possible. In reality, however, limited resources and budgets make such exhaustive evaluation infeasible, which calls for effective algorithms that find the best possible hyperparameter configuration within a finite number of configuration evaluations. In this paper, we propose a new algorithm, ExperienceThinking, which simulates human thinking processes and combines the merits of existing techniques to solve this constrained hyperparameter optimization problem. In addition, we analyze the performance of three classical hyperparameter optimization algorithms under a finite number of configuration evaluations and compare them with ExperienceThinking. The experimental results show that our proposed algorithm provides superior results.
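To make the budget-constrained setting concrete, the sketch below implements random search under a hard evaluation budget, one of the classical baselines of the kind compared above. It is a minimal illustration, not the ExperienceThinking algorithm itself; the search space and the `evaluate` function are hypothetical stand-ins for an expensive model-training run.

```python
import random

# Hypothetical search space: each hyperparameter maps to its candidate values.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1],
    "num_layers": [1, 2, 3, 4],
    "batch_size": [16, 32, 64, 128],
}

def evaluate(config):
    """Placeholder for the expensive evaluation step (e.g., training a model
    and returning its validation accuracy). Replaced here by a toy score."""
    return -abs(config["learning_rate"] - 1e-2) + 0.01 * config["num_layers"]

def random_search(budget):
    """Budget-constrained random search: at most `budget` configurations are
    ever evaluated, mirroring the finite-evaluation constraint."""
    best_config, best_score = None, float("-inf")
    for _ in range(budget):
        config = {name: random.choice(values)
                  for name, values in SEARCH_SPACE.items()}
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = random_search(budget=20)
print(best, score)
```

The key point is that `budget` bounds the number of calls to `evaluate`; smarter methods differ only in how they choose which configurations to spend those calls on.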
