ProxyBO: Accelerating Neural Architecture Search via Bayesian Optimization with Zero-Cost Proxies
DOI:
https://doi.org/10.1609/aaai.v37i8.26169
Keywords:
ML: Auto ML and Hyperparameter Tuning, SO: Algorithm Configuration
Abstract
Designing neural architectures requires immense manual effort, which has motivated the development of neural architecture search (NAS) to automate the design process. While previous NAS methods achieve promising results, they run slowly; zero-cost proxies run extremely fast but yield less promising results. There is therefore great potential in accelerating NAS via zero-cost proxies. The existing method has two limitations: unforeseeable reliability and one-shot usage. To address these limitations, we present ProxyBO, an efficient Bayesian optimization (BO) framework that utilizes zero-cost proxies to accelerate neural architecture search. We apply a generalization ability measurement to estimate the fitness of the proxies on the task during each iteration and design a novel acquisition function to combine BO with zero-cost proxies based on their dynamic influence. Extensive empirical studies show that ProxyBO consistently outperforms competitive baselines on five tasks from three public benchmarks. Concretely, ProxyBO achieves up to 5.41× and 3.86× speedups over the state-of-the-art approaches REA and BRP-NAS.
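To make the mechanism described in the abstract concrete, below is a minimal sketch of the iteration loop's two ingredients: re-estimating a proxy's fitness on the current task and folding the proxy into the acquisition step according to that fitness. This is an illustrative reading, not the paper's exact formulation: the GP-with-expected-improvement surrogate, the Kendall-tau fitness estimate, the rank-based blending rule, and all helper names (`proxy_fitness`, `combined_acquisition`) are assumptions made for exposition.

```python
import numpy as np
from scipy.stats import kendalltau, norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(gp, candidates, best_y):
    """Standard EI acquisition under a GP surrogate (minimizing error)."""
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def proxy_fitness(proxy_scores, neg_errors):
    """Estimate a proxy's fitness on the task as the rank correlation
    between its scores and observed performance; a stand-in for the
    paper's generalization ability measurement, not its definition."""
    tau, _ = kendalltau(proxy_scores, neg_errors)
    return 0.0 if np.isnan(tau) else max(tau, 0.0)  # drop useless proxies

def combined_acquisition(gp, candidates, best_y, proxy_scores, fitness):
    """Blend the BO acquisition with a zero-cost proxy ranking, weighting
    the proxy by its dynamically estimated fitness (assumed scheme)."""
    ei = expected_improvement(gp, candidates, best_y)
    ei_rank = np.argsort(np.argsort(-ei))            # rank 0 = best EI
    proxy_rank = np.argsort(np.argsort(-proxy_scores))  # rank 0 = best proxy
    blended = (1 - fitness) * ei_rank + fitness * proxy_rank
    return int(np.argmin(blended))  # next architecture to actually train

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((10, 4))      # encoded architectures already evaluated
    y = rng.random(10)           # their validation errors (lower is better)
    cand = rng.random((50, 4))   # candidate architectures
    proxy_obs = rng.random(10)   # proxy scores of evaluated architectures
    proxy_cand = rng.random(50)  # proxy scores of candidates

    gp = GaussianProcessRegressor().fit(X, y)
    w = proxy_fitness(proxy_obs, -y)  # high proxy score should mean low error
    nxt = combined_acquisition(gp, cand, y.min(), proxy_cand, w)
    print(f"proxy fitness: {w:.2f}, next candidate index: {nxt}")
```

In this sketch the proxy's influence on the selection grows or shrinks each iteration with its measured fitness, which captures the abstract's point that the proxies are not trusted once and for all ("one-shot usage") but re-weighted dynamically as evidence accumulates.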
Published
2023-06-26
How to Cite
Shen, Y., Li, Y., Zheng, J., Zhang, W., Yao, P., Li, J., Yang, S., Liu, J., & Cui, B. (2023). ProxyBO: Accelerating Neural Architecture Search via Bayesian Optimization with Zero-Cost Proxies. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9792-9801. https://doi.org/10.1609/aaai.v37i8.26169
Issue
Vol. 37 No. 8
Section
AAAI Technical Track on Machine Learning III