Binarized Neural Architecture Search

Authors

  • Hanlin Chen, Beihang University
  • Li'an Zhuo, Beihang University
  • Baochang Zhang, Beihang University
  • Xiawu Zheng, Xiamen University
  • Jianzhuang Liu, Shenzhen Institutes of Advanced Technology, University of Chinese Academy of Sciences
  • David Doermann, University at Buffalo
  • Rongrong Ji, Xiamen University

DOI:

https://doi.org/10.1609/aaai.v34i07.6624

Abstract

Neural architecture search (NAS) can have a significant impact on computer vision by automatically designing optimal neural network architectures for various tasks. A variant, binarized neural architecture search (BNAS), with a search space of binarized convolutions, can produce extremely compressed models. Unfortunately, this area remains largely unexplored. BNAS is more challenging than NAS due to the learning inefficiency caused by the optimization requirements of binarization and the huge architecture space. To address these issues, we introduce channel sampling and operation space reduction into a differentiable NAS, significantly reducing the cost of the search. The reduction is accomplished through a performance-based strategy that abandons less promising operations. Two optimization methods for binarized neural networks are used to validate the effectiveness of our BNAS. Extensive experiments demonstrate that the proposed BNAS achieves performance comparable to NAS on both the CIFAR and ImageNet datasets. On CIFAR-10, it achieves an accuracy of 96.53% vs. 97.22%, but with a significantly compressed model and a search 40% faster than that of the state-of-the-art PC-DARTS.
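To make the two ideas in the abstract concrete, the sketch below shows (1) channel sampling inside a differentiable mixed operation over binarized convolutions and (2) a performance-based step that abandons the least promising candidate operations. This is a minimal illustration under stated assumptions, not the authors' implementation: the names `BinConv2d`, `MixedOp`, `prune_ops`, the candidate-operation set, and the sampling ratio `K` are all hypothetical, and the architecture weights `alpha` stand in as the performance proxy.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinConv2d(nn.Module):
    """Convolution with sign-binarized weights (straight-through estimator)."""

    def __init__(self, c_in, c_out, kernel_size, padding):
        super().__init__()
        self.weight = nn.Parameter(
            0.01 * torch.randn(c_out, c_in, kernel_size, kernel_size))
        self.padding = padding

    def forward(self, x):
        w = self.weight
        # Binarize weights to {-1, +1} in the forward pass; the detach
        # trick passes gradients straight through to the real-valued w.
        w_bin = w + (torch.sign(w) - w).detach()
        return F.conv2d(x, w_bin, padding=self.padding)


class MixedOp(nn.Module):
    """Softmax-weighted mix of candidate ops on a sampled 1/K channel subset."""

    def __init__(self, channels, K=4):
        super().__init__()
        self.K = K
        c = channels // K  # only 1/K of the channels enter the candidate ops
        self.ops = nn.ModuleList([
            BinConv2d(c, c, 3, padding=1),
            BinConv2d(c, c, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        self.active = list(range(len(self.ops)))  # ops not yet abandoned

    def forward(self, x, alpha):
        c = x.size(1) // self.K
        x_sampled, x_rest = x[:, :c], x[:, c:]  # channel sampling
        weights = F.softmax(alpha[self.active], dim=0)
        out = sum(w * self.ops[i](x_sampled)
                  for w, i in zip(weights, self.active))
        # Unsampled channels bypass the mixed op and are concatenated back.
        return torch.cat([out, x_rest], dim=1)

    def prune_ops(self, alpha, keep):
        """Abandon the least promising ops, ranked by architecture weight."""
        ranked = sorted(self.active, key=lambda i: float(alpha[i]), reverse=True)
        self.active = sorted(ranked[:keep])


# Toy usage: one mixed op with four candidates, then reduce the space to two.
op = MixedOp(channels=16)
alpha = torch.zeros(len(op.ops), requires_grad=True)
y = op(torch.randn(2, 16, 8, 8), alpha)   # output shape (2, 16, 8, 8)
op.prune_ops(alpha.detach(), keep=2)      # later iterations search 2 ops only
```

In a full search, weight training and architecture-parameter updates would alternate, with the reduction step applied repeatedly as the search progresses so that each round spans a smaller, cheaper operation space.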

Published

2020-04-03

How to Cite

Chen, H., Zhuo, L., Zhang, B., Zheng, X., Liu, J., Doermann, D., & Ji, R. (2020). Binarized Neural Architecture Search. Proceedings of the AAAI Conference on Artificial Intelligence, 34(07), 10526-10533. https://doi.org/10.1609/aaai.v34i07.6624

Issue

Vol. 34 No. 07 (2020)

Section

AAAI Technical Track: Vision