AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment

Authors

  • Rei Sato, University of Tsukuba / RIKEN AIP
  • Jun Sakuma, University of Tsukuba / RIKEN AIP
  • Youhei Akimoto, University of Tsukuba / RIKEN AIP

DOI:

https://doi.org/10.1609/aaai.v35i11.17143

Keywords:

Hyperparameter Tuning / Algorithm Configuration, (Deep) Neural Network Algorithms, Optimization

Abstract

Neural architecture search (NAS) is an approach for automatically designing a neural network architecture without human effort or expert knowledge. However, the high computational cost of NAS limits its use in commercial applications. Two recent NAS paradigms, namely one-shot and sparse propagation, which reduce the time and space complexities, respectively, provide clues for solving this problem. In this paper, we propose a novel search strategy for one-shot and sparse propagation NAS, namely AdvantageNAS, which further reduces the time complexity of NAS by reducing the number of search iterations. AdvantageNAS is a gradient-based approach that improves search efficiency by introducing credit assignment in gradient estimation for architecture updates. Experiments on NAS-Bench-201 and the PTB dataset show that AdvantageNAS discovers an architecture with higher performance under a limited time budget than existing sparse propagation NAS. To further demonstrate the reliability of AdvantageNAS, we investigate it theoretically and show that it monotonically improves the expected loss and thus converges.
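To make the idea of credit assignment in architecture gradient estimation concrete, the sketch below is a minimal, illustrative Python/PyTorch example, not the authors' implementation. It assumes a toy search space (`n_edges`, `n_ops`), a stand-in `evaluate` function in place of a real one-shot supernet, and a REINFORCE-with-baseline style estimator in which each edge's log-probability gradient is weighted by its own advantage; the exact AdvantageNAS estimator is defined in the paper.

```python
# Minimal sketch (not the authors' code) of advantage-style credit assignment
# for one-shot, sparse-propagation NAS. All names and sizes are illustrative.
import torch

n_edges, n_ops = 6, 5                                    # toy search space
theta = torch.zeros(n_edges, n_ops, requires_grad=True)  # architecture logits
baseline = torch.zeros(n_edges)                          # per-edge baseline
opt = torch.optim.Adam([theta], lr=0.1)

def evaluate(arch):
    """Stand-in for a one-shot supernet evaluation: returns a scalar loss for
    the sampled architecture. In a real supernet, sparse propagation means
    only the chosen operation on each edge is executed."""
    target = torch.tensor([0, 1, 2, 3, 4, 0])
    return (arch != target).float().mean()

for step in range(200):
    probs = torch.softmax(theta, dim=-1)
    dist = torch.distributions.Categorical(probs)
    arch = dist.sample()                 # one op per edge (sparse propagation)
    loss = evaluate(arch)                # single evaluation of the sampled net

    # Credit assignment: each edge's log-prob gradient is weighted by its own
    # advantage (loss minus a per-edge baseline), rather than by the raw loss.
    advantage = loss.detach() - baseline
    surrogate = (advantage * dist.log_prob(arch)).sum()

    opt.zero_grad()
    surrogate.backward()
    opt.step()

    baseline = 0.9 * baseline + 0.1 * loss.detach()  # running baseline update
```

The per-edge baseline is the hedged stand-in here for the paper's credit-assignment mechanism: it reduces the variance of the architecture gradient so that fewer search iterations are needed, which is the efficiency gain the abstract describes.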

Published

2021-05-18

How to Cite

Sato, R., Sakuma, J., & Akimoto, Y. (2021). AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9489-9496. https://doi.org/10.1609/aaai.v35i11.17143

Section

AAAI Technical Track on Machine Learning IV