Efficient Parameter Importance Analysis via Ablation with Surrogates

To achieve peak performance, it is often necessary to adapt the parameters of a given algorithm to the class of problem instances to be solved; this is known to be the case for popular solvers across a broad range of AI problems, including AI planning, propositional satisfiability (SAT) and answer set programming (ASP). To avoid tedious and often highly sub-optimal manual tuning of such parameters by means of ad-hoc methods, general-purpose algorithm configuration procedures can be used to automatically find performance-optimizing parameter settings. While impressive performance gains are often achieved in this manner, additional, potentially costly parameter importance analysis is required to gain insights into which parameter changes are most responsible for those improvements. Here, we show how the running time cost of ablation analysis, a well-known general-purpose approach for assessing parameter importance, can be reduced substantially by using regression models of algorithm performance constructed from data collected during the configuration process. In our experiments, we demonstrate speed-up factors between 33 and 14,727 for ablation analysis on various configuration scenarios from AI planning, SAT, ASP and mixed integer programming (MIP).
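To make the idea concrete, the following is a minimal, illustrative sketch of surrogate-based ablation, not the authors' implementation: a random forest regression model is trained on (configuration, runtime) pairs collected during configuration, and the greedy ablation path from the default to the optimized configuration is then computed with model predictions in place of actual target-algorithm runs. The function names, parameter encoding, and toy data below are all assumptions for illustration; scikit-learn's RandomForestRegressor stands in for the empirical performance model.

```python
# Minimal sketch of ablation analysis with a surrogate model.
# Assumptions (illustrative, not from the paper): configurations are
# dicts mapping parameter -> numeric value, and a random forest trained
# on (configuration, runtime) data from the configuration run serves as
# the surrogate. Lower predicted runtime is better.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def encode(config, param_order):
    """Map a configuration dict to a numeric feature vector."""
    return [config[p] for p in param_order]

def surrogate_ablation(default, optimized, surrogate, param_order):
    """Greedy ablation path from `default` to `optimized`: in each round,
    flip the single remaining parameter whose change the surrogate
    predicts to improve performance the most."""
    current = dict(default)
    remaining = [p for p in param_order if default[p] != optimized[p]]
    path = []
    while remaining:
        candidates = []
        for p in remaining:
            cand = dict(current)
            cand[p] = optimized[p]
            # One model prediction replaces many target-algorithm runs.
            pred = surrogate.predict([encode(cand, param_order)])[0]
            candidates.append((pred, p, cand))
        pred, p, current = min(candidates, key=lambda t: t[0])
        remaining.remove(p)
        path.append((p, pred))
    return path

# Toy example: three binary parameters, synthetic runtimes in which
# p1 matters most, p3 a little, and p2 not at all.
param_order = ["p1", "p2", "p3"]
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 3))  # configurations seen during tuning
y = 10.0 - 3 * X[:, 0] - 1 * X[:, 2] + rng.normal(0, 0.1, 200)
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

default = {"p1": 0, "p2": 0, "p3": 0}
optimized = {"p1": 1, "p2": 1, "p3": 1}
for param, predicted in surrogate_ablation(default, optimized, surrogate, param_order):
    print(f"flip {param}: predicted runtime {predicted:.2f}")
```

The speed-up reported in the abstract comes from exactly this substitution: each candidate flip in a round costs one model prediction rather than runs of the target algorithm across an entire instance set.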
