Warm Starting CMA-ES for Hyperparameter Optimization

Authors

  • Masahiro Nomura, CyberAgent, Inc.; Artificial Intelligence Research Center, AIST
  • Shuhei Watanabe, University of Freiburg
  • Youhei Akimoto, University of Tsukuba; RIKEN Center for Advanced Intelligence Project
  • Yoshihiko Ozaki, Artificial Intelligence Research Center, AIST; GREE, Inc.
  • Masaki Onishi, Artificial Intelligence Research Center, AIST

DOI:

https://doi.org/10.1609/aaai.v35i10.17109

Keywords:

Transfer/Adaptation/Multi-task/Meta/Automated Learning

Abstract

Hyperparameter optimization (HPO), formulated as black-box optimization (BBO), is recognized as essential for the automation and high performance of machine learning approaches. The CMA-ES is a promising BBO approach with a high degree of parallelism; it has been applied to HPO tasks, often in parallel implementations, and has shown performance superior to other approaches, including Bayesian optimization (BO). However, when the budget of hyperparameter evaluations is severely limited, which is often the case for end users without access to parallel computing resources, the CMA-ES exhausts the budget without improving performance because of its long adaptation phase, and is consequently outperformed by BO approaches. To address this issue, we propose to transfer prior knowledge from similar HPO tasks through the initialization of the CMA-ES, significantly shortening the adaptation time. The knowledge transfer is designed based on a novel definition of task similarity, whose correlation with the performance of the proposed approach is confirmed on synthetic problems. The proposed warm-starting CMA-ES, called WS-CMA-ES, is applied to different HPO tasks where some prior knowledge is available, and shows superior performance over the original CMA-ES as well as BO approaches with or without the prior knowledge.
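The warm-starting idea can be sketched in a few lines of Python: fit a Gaussian to the best-performing fraction of evaluations from a source task and use it to initialize the CMA-ES search distribution on the target task. The sketch below is a simplified illustration of that idea, not the paper's exact warm-start estimator; the `cmaes` package, the `warm_start_params` helper, the `gamma` quantile, and the toy objectives are all assumptions introduced for this example.

```python
import numpy as np
from cmaes import CMA  # pip install cmaes; assumed available for this sketch


def warm_start_params(source_X, source_f, gamma=0.1):
    """Fit a Gaussian to the top-gamma fraction of source-task evaluations.

    Returns a mean vector and a scalar step size that can seed the CMA-ES
    on a similar target task. This is a simplified stand-in for the
    warm-start distribution used in the paper, not its exact estimator.
    """
    k = max(2, int(np.ceil(gamma * len(source_f))))
    top = source_X[np.argsort(source_f)[:k]]          # best k points (minimization)
    mean = top.mean(axis=0)
    sigma = float(np.sqrt(top.var(axis=0).mean()))    # average coordinate-wise spread
    return mean, max(sigma, 1e-3)                     # floor sigma to stay positive


# Source task: evaluations gathered on a similar HPO problem (toy data here).
rng = np.random.default_rng(0)
source_X = rng.uniform(-5.0, 5.0, size=(200, 2))
source_f = ((source_X - 1.0) ** 2).sum(axis=1)        # toy objective, optimum at (1, 1)

# Initialize CMA-ES from the source task instead of a broad default prior.
mean, sigma = warm_start_params(source_X, source_f)
optimizer = CMA(mean=mean, sigma=sigma)

# Target task: a shifted but similar toy objective, optimum at (1.2, 1.2).
for _ in range(30):
    solutions = []
    for _ in range(optimizer.population_size):
        x = optimizer.ask()
        value = float(((x - 1.2) ** 2).sum())
        solutions.append((x, value))
    optimizer.tell(solutions)
```

Because the initial mean and step size already concentrate on the promising region of the source task, the adaptation phase on the target task is much shorter than when starting from a wide uninformed distribution, which is the effect the abstract describes.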

Published

2021-05-18

How to Cite

Nomura, M., Watanabe, S., Akimoto, Y., Ozaki, Y., & Onishi, M. (2021). Warm Starting CMA-ES for Hyperparameter Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 9188-9196. https://doi.org/10.1609/aaai.v35i10.17109

Issue

Vol. 35 No. 10 (2021)

Section

AAAI Technical Track on Machine Learning III