Efficient Benchmarking of Hyperparameter Optimizers via Surrogates

Authors

  • Katharina Eggensperger, University of Freiburg
  • Frank Hutter, University of Freiburg
  • Holger Hoos, University of British Columbia
  • Kevin Leyton-Brown, University of British Columbia

DOI:

https://doi.org/10.1609/aaai.v29i1.9375

Keywords:

Sequential Model-based Bayesian Optimization, Hyperparameter optimization, Exploitation of benchmarks and experimentation, Performance modeling

Abstract

Hyperparameter optimization is crucial for achieving peak performance with many machine learning algorithms; however, the evaluation of new optimization techniques on real-world hyperparameter optimization problems can be very expensive. Therefore, experiments are often performed using cheap synthetic test functions with characteristics rather different from those of real benchmarks of interest. In this work, we introduce another option: cheap-to-evaluate surrogates of real hyperparameter optimization benchmarks that share the same hyperparameter spaces and feature similar response surfaces. Specifically, we train regression models on data describing a machine learning algorithm’s performance as a function of its hyperparameter settings, and then cheaply evaluate hyperparameter optimization methods using the model’s performance predictions in lieu of running the real algorithm. We evaluated a wide range of regression techniques, both in terms of how well they predict the performance of new hyperparameter settings and in terms of the quality of the surrogate benchmarks obtained. We found that tree-based models capture the performance of several machine learning algorithms well and yield surrogate benchmarks that closely resemble real-world benchmarks, while being much easier to use and orders of magnitude cheaper to evaluate.
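The core idea in the abstract can be illustrated in a few lines: fit a tree-based regression model on observed (hyperparameter setting, performance) pairs, then let an optimizer query the model's predictions instead of running the expensive algorithm. The sketch below is only illustrative, using a synthetic two-dimensional stand-in objective rather than any real benchmark from the paper, and a scikit-learn random forest as the tree-based model.

```python
# Illustrative sketch of a surrogate benchmark (not the paper's code).
# The "true" objective here is a synthetic stand-in for an expensive
# machine learning training run over (learning rate, tree depth).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def true_performance(X):
    # Hypothetical expensive benchmark: best near lr=1e-3, depth=5.
    return (np.log10(X[:, 0]) + 3) ** 2 + 0.1 * (X[:, 1] - 5) ** 2

# Offline data: performance recorded at previously evaluated settings.
X = np.column_stack([10 ** rng.uniform(-5, 0, 200), rng.uniform(1, 10, 200)])
y = true_performance(X)

# Tree-based surrogate of the response surface.
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# A hyperparameter optimizer now evaluates candidates against the
# surrogate's predictions instead of the real (expensive) algorithm.
candidates = np.column_stack(
    [10 ** rng.uniform(-5, 0, 1000), rng.uniform(1, 10, 1000)]
)
best = candidates[np.argmin(surrogate.predict(candidates))]
```

Once trained, each surrogate query is a single model prediction, which is what makes such benchmarks orders of magnitude cheaper to evaluate than the original algorithm runs.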

Published

2015-02-16

How to Cite

Eggensperger, K., Hutter, F., Hoos, H., & Leyton-Brown, K. (2015). Efficient Benchmarking of Hyperparameter Optimizers via Surrogates. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9375

Section

AAAI Technical Track: Heuristic Search and Optimization