Bandit Limited Discrepancy Search and Application to Machine Learning Pipeline Optimization

Authors

  • Akihiro Kishimoto, IBM Research
  • Djallel Bouneffouf, IBM Research
  • Radu Marinescu, IBM Research
  • Parikshit Ram, IBM Research
  • Ambrish Rawat, IBM Research
  • Martin Wistuba, Amazon Research
  • Paulito Palmes, IBM Research
  • Adi Botea, Eaton

DOI:

https://doi.org/10.1609/aaai.v36i9.21263

Keywords:

Search And Optimization (SO)

Abstract

Optimizing a machine learning (ML) pipeline has been an important topic in AI and ML. Despite recent progress, pipeline optimization remains a challenging problem, due to the potentially large number of algorithm combinations to consider as well as slow training and validation of each candidate. We present the BLDS algorithm for optimized selection of algorithms (ML operations) in a fixed ML pipeline structure. BLDS performs multi-fidelity optimization, selecting ML algorithms that can be trained with small computational overhead, while controlling its pipeline search with multi-armed bandits and limited discrepancy search. Our experiments on well-known classification benchmarks show that BLDS is superior to competing algorithms. We also combine BLDS with hyperparameter optimization, empirically showing the advantage of BLDS.
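The abstract only sketches the approach. The code below is a minimal illustrative sketch, not the authors' BLDS implementation: it shows how a UCB1-style bandit can re-rank each step's candidate algorithms between rounds of limited discrepancy search over a fixed pipeline structure, with cheap low-fidelity evaluations guiding the search. The step names, candidate lists, and the `evaluate` stub are hypothetical placeholders.

```python
# Illustrative sketch only: bandit-guided limited discrepancy search over a
# fixed pipeline structure. All names and the evaluate() stub are hypothetical.
import math
import random
from collections import defaultdict

# Fixed pipeline structure: an ordered list of candidate algorithms per step.
PIPELINE = {
    "scaler": ["standard", "minmax", "none"],
    "classifier": ["logreg", "random_forest", "svm"],
}

pulls = defaultdict(int)      # (step, algo) -> number of evaluations
rewards = defaultdict(float)  # (step, algo) -> cumulative validation score
total = 0                     # total evaluations so far


def ucb(step, algo):
    """UCB1 score: exploit observed mean reward, explore rarely tried arms."""
    n = pulls[(step, algo)]
    if n == 0:
        return float("inf")
    return rewards[(step, algo)] / n + 1.4 * math.sqrt(math.log(total) / n)


def lds(steps, names, i, used, budget, partial):
    """Yield pipeline configurations that deviate from each step's top-ranked
    candidate at most `budget` times (one discrepancy per non-preferred pick)."""
    if i == len(names):
        yield dict(partial)
        return
    for rank, algo in enumerate(steps[names[i]]):
        cost = 0 if rank == 0 else 1
        if used + cost > budget:
            break
        partial[names[i]] = algo
        yield from lds(steps, names, i + 1, used + cost, budget, partial)


def evaluate(config, fidelity=0.1):
    """Hypothetical low-fidelity evaluation (e.g. accuracy on a subsample);
    replaced by a random stub so the sketch runs standalone."""
    return random.random()


best, best_score = None, -1.0
for budget in range(len(PIPELINE) + 1):   # gradually allow more discrepancies
    # Bandit step: re-rank each step's candidates by their current UCB score.
    steps = {s: sorted(a, key=lambda x: -ucb(s, x)) for s, a in PIPELINE.items()}
    for config in lds(steps, list(steps), 0, 0, budget, {}):
        score = evaluate(config)
        total += 1
        for step, algo in config.items():  # update bandit statistics
            pulls[(step, algo)] += 1
            rewards[(step, algo)] += score
        if score > best_score:
            best, best_score = config, score
print("best pipeline:", best, "score:", round(best_score, 3))
```

As in plain limited discrepancy search, configurations with few discrepancies are revisited when the budget grows; the bandit statistics accumulated in earlier rounds simply re-rank the candidates for later rounds.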

Published

2022-06-28

How to Cite

Kishimoto, A., Bouneffouf, D., Marinescu, R., Ram, P., Rawat, A., Wistuba, M., Palmes, P., & Botea, A. (2022). Bandit Limited Discrepancy Search and Application to Machine Learning Pipeline Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 36(9), 10228-10237. https://doi.org/10.1609/aaai.v36i9.21263

Section

AAAI Technical Track on Search and Optimization