Posterior-Guided Neural Architecture Search


  • Yizhou Zhou, University of Science and Technology of China
  • Xiaoyan Sun, Microsoft
  • Chong Luo, Microsoft
  • Zheng-Jun Zha, University of Science and Technology of China
  • Wenjun Zeng, Microsoft



The emergence of neural architecture search (NAS) has greatly advanced research on network design. Recent proposals such as gradient-based methods and one-shot approaches significantly boost the efficiency of NAS. In this paper, we formulate the NAS problem from a Bayesian perspective. We propose explicitly estimating the joint posterior distribution over pairs of network architecture and weights. Accordingly, a hybrid network representation is presented that enables us to leverage variational dropout, so that approximating the posterior distribution becomes fully gradient-based and highly efficient. A posterior-guided sampling method is then presented to sample architecture candidates and evaluate them directly. As a Bayesian approach, our posterior-guided NAS (PGNAS) avoids tuning a number of hyper-parameters and enables highly effective architecture sampling in posterior probability space. Interestingly, it also leads to a deeper insight into the weight sharing used in one-shot NAS and naturally alleviates the mismatch between the sampled architecture and weights caused by weight sharing. We validate our PGNAS method on the fundamental image classification task. Results on Cifar-10, Cifar-100 and ImageNet show that PGNAS achieves a good trade-off between search precision and speed among NAS methods. For example, it takes 11 GPU days to search a very competitive architecture with 1.98% and 14.28% test errors on Cifar-10 and Cifar-100, respectively.
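To make the sampling step concrete, the following is a minimal toy sketch of posterior-guided architecture sampling. It is not the authors' implementation: the per-operation keep-probabilities (which in PGNAS would be learned via variational dropout during supernet training), the two-edge search space, and the `toy_eval` scoring function are all invented here purely for illustration. The sketch only shows the idea of drawing candidate architectures from a factorized approximate posterior and keeping the best-evaluated one.

```python
import random

# Hypothetical learned keep-probabilities for 3 candidate ops on each of 2 edges.
# In PGNAS these would come from variational-dropout training; here they are made up.
keep_probs = [
    [0.9, 0.3, 0.1],  # edge 0: op 0 strongly favored by the posterior
    [0.2, 0.8, 0.4],  # edge 1: op 1 strongly favored by the posterior
]

def sample_architecture(probs, rng):
    """Draw one architecture from a factorized posterior: on each edge,
    every op's Bernoulli keep-gate fires independently; pick uniformly
    among the ops that fired, falling back to the most probable op."""
    arch = []
    for edge in probs:
        fired = [i for i, p in enumerate(edge) if rng.random() < p]
        if fired:
            arch.append(rng.choice(fired))
        else:
            arch.append(max(range(len(edge)), key=lambda i: edge[i]))
    return tuple(arch)

def posterior_guided_search(probs, evaluate, n_samples=50, seed=0):
    """Sample n_samples architectures and return the best under `evaluate`."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_samples):
        arch = sample_architecture(probs, rng)
        score = evaluate(arch)
        if score > best_score:
            best, best_score = arch, score
    return best, best_score

# Toy stand-in for validation accuracy: rewards ops the posterior favors.
toy_eval = lambda arch: sum(keep_probs[e][op] for e, op in enumerate(arch))

best_arch, best_score = posterior_guided_search(keep_probs, toy_eval)
print(best_arch, best_score)
```

Because sampling concentrates on high-posterior regions, most of the evaluation budget is spent on architectures the trained supernet already considers plausible, rather than on uniformly random candidates.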




How to Cite

Zhou, Y., Sun, X., Luo, C., Zha, Z.-J., & Zeng, W. (2020). Posterior-Guided Neural Architecture Search. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 6973-6980.



AAAI Technical Track: Machine Learning