Efficient Neural Architecture Search via Proximal Iterations

Authors

  • Quanming Yao, 4Paradigm
  • Ju Xu, Peking University
  • Wei-Wei Tu, 4Paradigm
  • Zhanxing Zhu, Peking University

DOI:

https://doi.org/10.1609/aaai.v34i04.6143

Abstract

Neural architecture search (NAS) attracts much research attention because of its ability to identify better architectures than handcrafted ones. Recently, differentiable search methods have become the state of the art for NAS, as they can obtain high-performance architectures within several days. However, they still suffer from high computational cost and inferior performance due to the construction of the supernet. In this paper, we propose an efficient NAS method based on proximal iterations (denoted NASP). Unlike previous works, NASP reformulates the search process as an optimization problem with a discrete constraint on architectures and a regularizer on model complexity. As the new objective is hard to solve, we further propose an efficient algorithm inspired by proximal iterations. In this way, NASP is not only much faster than existing differentiable search methods but can also find better architectures and balance model complexity. Finally, extensive experiments on various tasks demonstrate that NASP obtains high-performance architectures with more than a 10x speedup over the state of the art.
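To illustrate the core idea of searching under a discrete constraint on architectures, below is a minimal sketch (not the authors' released implementation) of the proximal step such a formulation admits in closed form: real-valued architecture scores for each edge of the supernet are projected onto one-hot operation choices. The names proximal_step_discrete and arch_params are hypothetical, and the snippet assumes only NumPy.

import numpy as np

def proximal_step_discrete(arch_params):
    """Project continuous architecture scores onto one-hot vectors.

    For each edge, the closest one-hot vector (in Euclidean distance)
    keeps only the highest-scoring candidate operation; this is the
    closed-form proximal operator of the indicator of the discrete set.

    arch_params: array of shape (num_edges, num_ops).
    """
    projected = np.zeros_like(arch_params)
    best_ops = np.argmax(arch_params, axis=1)  # best operation per edge
    projected[np.arange(arch_params.shape[0]), best_ops] = 1.0
    return projected

# Toy usage: 3 edges, 4 candidate operations each.
scores = np.random.randn(3, 4)
print(proximal_step_discrete(scores))

In a NASP-style search loop, a projection like this would alternate with gradient updates on the continuous scores and on the network weights, so that only one operation per edge is active during each forward pass.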

Published

2020-04-03

How to Cite

Yao, Q., Xu, J., Tu, W.-W., & Zhu, Z. (2020). Efficient Neural Architecture Search via Proximal Iterations. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 6664-6671. https://doi.org/10.1609/aaai.v34i04.6143

Section

AAAI Technical Track: Machine Learning