Sequential Parameter Optimization (SPO) and the role of tuning in experimental analysis
Experimental research in Evolutionary Computation (EC) has often been regarded as inferior to theory because of a certain arbitrariness in the choice of test sets, parameters, and performance measures. SPO integrates several state-of-the-art statistical techniques, e.g., Design and Analysis of Computer Experiments (DACE), and connects them with objectives formulated by the 'new experimentalism' movement, thus aiming to enhance the standing of experimentation with heuristic optimization algorithms. We review past and present developments of experimental research in EC and then discuss possible benefits of parameter tuning: a) fair comparison, b) experimental algorithm analysis, and c) adaptation of optimization algorithms to specific (real-world) problems. SPO's basic working mechanisms are introduced, also hinting at open questions, e.g., the size of the algorithm run budget, performance measures, and coping with nondeterminism. Based on some example applications, we discuss the role of parameters and tuning methods for the adaptability of algorithms to problems, an important but currently underestimated property of optimization techniques.
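To make the sequential tuning idea concrete, the sketch below illustrates an SPO-style loop: evaluate an initial design of parameter settings, fit a DACE-like surrogate model, and sequentially add the most promising candidate until the run budget is exhausted. This is a minimal illustration under stated assumptions, not the authors' SPOT implementation; the one-dimensional parameter, the synthetic noisy objective `noisy_algorithm_performance`, and the lower-confidence-bound selection rule are all illustrative choices.

```python
# Minimal SPO-style tuning sketch (illustrative, not the authors' SPOT tool).
# Assumes one tunable parameter in [0, 1] and a noisy, to-be-minimized
# performance measure returned by a stand-in objective function.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def noisy_algorithm_performance(param):
    # Hypothetical stochastic objective: best mean performance near param = 0.3.
    return (param - 0.3) ** 2 + 0.01 * rng.standard_normal()

budget = 30        # total number of algorithm runs allowed
n_initial = 10     # size of the initial (space-filling) design
X = rng.uniform(0.0, 1.0, size=(n_initial, 1))
y = np.array([noisy_algorithm_performance(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

while len(y) < budget:
    gp.fit(X, y)
    # Dense candidate grid; choose the point with the best lower confidence
    # bound, trading off predicted performance against model uncertainty.
    candidates = np.linspace(0.0, 1.0, 201).reshape(-1, 1)
    mean, std = gp.predict(candidates, return_std=True)
    best = candidates[np.argmin(mean - 1.0 * std)]
    X = np.vstack([X, best])
    y = np.append(y, noisy_algorithm_performance(best[0]))

print("best parameter found:", X[np.argmin(y)][0])
```

The loop makes the open questions mentioned above tangible: the run budget caps the number of objective evaluations, the choice of performance measure determines what `y` records, and repeated evaluations would be needed to cope with nondeterminism.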