Efficient Parameter Importance Analysis via Ablation with Surrogates


  • Andre Biedenkapp University of Freiburg
  • Marius Lindauer University of Freiburg
  • Katharina Eggensperger University of Freiburg
  • Frank Hutter University of Freiburg
  • Chris Fawcett University of British Columbia
  • Holger Hoos University of British Columbia

Keywords: Algorithm Configuration, Parameter Importance, Performance Prediction


Abstract

To achieve peak performance, it is often necessary to adjust the parameters of a given algorithm to the class of problem instances to be solved; this is known to be the case for popular solvers for a broad range of AI problems, including AI planning, propositional satisfiability (SAT) and answer set programming (ASP). To avoid tedious and often highly sub-optimal manual tuning of such parameters by means of ad-hoc methods, general-purpose algorithm configuration procedures can be used to automatically find performance-optimizing parameter settings. While impressive performance gains are often achieved in this manner, additional, potentially costly parameter importance analysis is required to gain insights into what parameter changes are most responsible for those improvements. Here, we show how the running time cost of ablation analysis, a well-known general-purpose approach for assessing parameter importance, can be reduced substantially by using regression models of algorithm performance constructed from data collected during the configuration process. In our experiments, we demonstrate speed-up factors between 33 and 14 727 for ablation analysis on various configuration scenarios from AI planning, SAT, ASP and mixed integer programming (MIP).
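The core idea of the abstract can be sketched in code: classical ablation greedily flips parameters from the default configuration toward the optimized one, at each step committing the single flip that most improves performance; replacing the expensive algorithm runs with a regression surrogate trained on configuration data makes each step a cheap model prediction. The sketch below is a simplified, hypothetical illustration of that idea, not the authors' implementation: the function name `surrogate_ablation`, the random-forest surrogate, and the synthetic performance data are all assumptions made for the example.

```python
# Minimal sketch of surrogate-based ablation analysis (hypothetical,
# simplified). A regression model trained on (configuration, cost) pairs
# collected during configuration stands in for real algorithm runs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def surrogate_ablation(default, optimized, model):
    """Greedy ablation path from `default` to `optimized`.

    default, optimized: sequences of (numeric) parameter values.
    model: regressor with .predict(), trained on (config, cost) data.
    Returns a list of (parameter index, predicted cost after the flip),
    in the order the parameters were flipped.
    """
    current = np.array(default, dtype=float)
    remaining = [i for i in range(len(default)) if default[i] != optimized[i]]
    path = []
    while remaining:
        best_i, best_cost = None, np.inf
        # Try flipping each remaining parameter to its optimized value;
        # the surrogate predicts the cost instead of a real run.
        for i in remaining:
            cand = current.copy()
            cand[i] = optimized[i]
            cost = model.predict(cand.reshape(1, -1))[0]
            if cost < best_cost:
                best_i, best_cost = i, cost
        # Commit the flip with the best predicted performance.
        current[best_i] = optimized[best_i]
        remaining.remove(best_i)
        path.append((best_i, best_cost))
    return path

# Toy demo on synthetic data (not from the paper): parameter 0 dominates
# the cost, so ablation should flip it first.
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = 10 * X[:, 0] + 2 * X[:, 1] + 0.1 * X[:, 2]
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
path = surrogate_ablation([1.0, 1.0, 1.0], [0.0, 0.0, 0.0], model)
```

The per-flip drop in predicted cost along `path` serves as the importance estimate for each parameter; in the toy demo, parameter 0 yields by far the largest drop.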

How to Cite

Biedenkapp, A., Lindauer, M., Eggensperger, K., Hutter, F., Fawcett, C., & Hoos, H. (2017). Efficient Parameter Importance Analysis via Ablation with Surrogates. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10657

AAAI Technical Track: Heuristic Search and Optimization