TY - JOUR
AU - Hu, Yi-Qi
AU - Yu, Yang
AU - Tu, Wei-Wei
AU - Yang, Qiang
AU - Chen, Yuqiang
AU - Dai, Wenyuan
PY - 2019/07/17
Y2 - 2024/03/29
TI - Multi-Fidelity Automatic Hyper-Parameter Tuning via Transfer Series Expansion
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 33
IS - 01
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v33i01.33013846
UR - https://ojs.aaai.org/index.php/AAAI/article/view/4272
SP - 3846-3853
AB - Automatic machine learning (AutoML) aims at automatically choosing the best configuration for machine learning tasks. However, a configuration evaluation can be very time-consuming, particularly on learning tasks with large datasets. This limitation usually restrains derivative-free optimization from releasing its full power for a fine configuration search using many evaluations. To alleviate this limitation, in this paper, we propose a derivative-free optimization framework for AutoML using multi-fidelity evaluations. It uses many low-fidelity evaluations on small data subsets and very few high-fidelity evaluations on the full dataset. However, the low-fidelity evaluations can be badly biased and need to be corrected at only a very low cost. We thus propose the Transfer Series Expansion (TSE), which learns the low-fidelity correction predictor efficiently by linearly combining a set of base predictors. The base predictors can be obtained cheaply from down-scaled and experienced tasks. Experimental results on real-world AutoML problems verify that the proposed framework can significantly accelerate derivative-free configuration search by making use of the multi-fidelity evaluations.
ER -