Initializing Bayesian Hyperparameter Optimization via Meta-Learning

Authors

  • Matthias Feurer, University of Freiburg
  • Jost Springenberg, University of Freiburg
  • Frank Hutter, University of Freiburg

DOI:

https://doi.org/10.1609/aaai.v29i1.9354

Keywords:

Machine Learning, Meta-Learning, Bayesian Optimization, Hyperparameter Optimization, Sequential Model-based Optimization

Abstract

Model selection and hyperparameter optimization are crucial when applying machine learning to a novel dataset. Recently, a subcommunity of machine learning has focused on solving this problem with Sequential Model-based Bayesian Optimization (SMBO), demonstrating substantial successes in many applications. However, for computationally expensive algorithms the overhead of hyperparameter optimization can still be prohibitive. In this paper we mimic a strategy human domain experts use: we speed up optimization by starting from promising configurations that performed well on similar datasets. The resulting initialization technique integrates naturally into the generic SMBO framework and can be trivially applied to any SMBO method. To validate our approach, we perform extensive experiments with two established SMBO frameworks (Spearmint and SMAC) with complementary strengths, optimizing two machine learning frameworks on 57 datasets. Our initialization procedure yields mild improvements for low-dimensional hyperparameter optimization and substantially improves the state of the art for the more complex combined algorithm selection and hyperparameter optimization problem.
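The core idea in the abstract, warm-starting SMBO from configurations that worked well on similar datasets, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the choice of L1 distance over meta-features, and the toy meta-features (log sample count, log feature count, class entropy) are assumptions made for the example.

```python
# Hedged sketch of meta-learning initialization for SMBO: rank previously
# seen datasets by meta-feature distance to the new dataset, then seed the
# optimizer's initial design with each nearest neighbour's best-known
# configuration. All names here are illustrative, not the paper's API.

def warmstart_configs(new_meta, history, k=2):
    """Return up to k hyperparameter configurations to evaluate first.

    new_meta : tuple of meta-feature values for the new dataset
    history  : list of (meta_features, best_config) pairs from past runs
    """
    def l1(a, b):
        # Simple L1 distance between meta-feature vectors.
        return sum(abs(x - y) for x, y in zip(a, b))

    ranked = sorted(history, key=lambda rec: l1(rec[0], new_meta))
    seeds = []
    for _, config in ranked:
        if config not in seeds:  # skip duplicate initial points
            seeds.append(config)
        if len(seeds) == k:
            break
    return seeds


# Toy history: meta-features are (log #samples, log #features, class entropy).
history = [
    ((3.0, 1.0, 0.9), {"C": 1.0, "gamma": 0.1}),
    ((6.0, 3.0, 0.2), {"C": 100.0, "gamma": 0.001}),
    ((3.2, 1.1, 0.8), {"C": 2.0, "gamma": 0.05}),
]
print(warmstart_configs((3.1, 1.0, 0.85), history, k=2))
```

After the seed configurations are evaluated, any SMBO method would fit its surrogate model to those results and proceed as usual, which is why the technique plugs into Spearmint and SMAC without modification.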

Published

2015-02-16

How to Cite

Feurer, M., Springenberg, J., & Hutter, F. (2015). Initializing Bayesian Hyperparameter Optimization via Meta-Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9354

Section

AAAI Technical Track: Heuristic Search and Optimization