NAS-LID: Efficient Neural Architecture Search with Local Intrinsic Dimension

Authors

  • Xin He, Hong Kong Baptist University; NVIDIA AI Tech Center
  • Jiangchao Yao, Shanghai Jiao Tong University; Shanghai AI Laboratory
  • Yuxin Wang, Hong Kong Baptist University
  • Zhenheng Tang, Hong Kong Baptist University
  • Ka Chun Cheung, Hong Kong Baptist University; NVIDIA AI Tech Centre
  • Simon See, Shanghai Jiao Tong University; NVIDIA AI Tech Centre; Mahindra University; Coventry University
  • Bo Han, Hong Kong Baptist University
  • Xiaowen Chu, The Hong Kong University of Science and Technology (Guangzhou); Hong Kong Baptist University

DOI:

https://doi.org/10.1609/aaai.v37i6.25949

Keywords:

ML: Auto ML and Hyperparameter Tuning, ML: Classification and Regression, ML: Deep Neural Architectures

Abstract

One-shot neural architecture search (NAS) substantially improves search efficiency by training a single supernet to estimate the performance of every possible child architecture (i.e., subnet). However, the inconsistent characteristics among subnets cause serious interference during optimization, resulting in a poor performance-ranking correlation among subnets. Subsequent works decompose the supernet weights via a particular criterion, e.g., gradient matching, to reduce this interference; however, they suffer from high computational cost and low space separability. In this work, we propose NAS-LID, a lightweight and effective method based on local intrinsic dimension (LID). NAS-LID evaluates the geometrical properties of architectures by calculating low-cost LID features layer by layer, and the similarity characterized by LID enjoys better separability than gradients, thereby effectively reducing the interference among subnets. Extensive experiments on NAS-Bench-201 indicate that NAS-LID achieves superior performance with better efficiency. Specifically, compared with the gradient-driven method, NAS-LID saves up to 86% of GPU memory overhead when searching on NAS-Bench-201. We also demonstrate the effectiveness of NAS-LID on the ProxylessNAS and OFA spaces. Source code: https://github.com/marsggbo/NAS-LID.
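To make the abstract's key idea concrete, the sketch below illustrates one plausible way to compute a layer-by-layer LID profile and compare two architectures by it. This is a rough illustration only, not the authors' released implementation (see the repository linked above for that): it assumes the standard Levina-Bickel maximum-likelihood LID estimator over nearest-neighbor distances, and the names lid_mle, lid_profile, lid_similarity, and the neighbor count k are illustrative choices.

```python
import numpy as np

def lid_mle(batch_features: np.ndarray, k: int = 20) -> float:
    """Mean maximum-likelihood LID estimate over a batch of activations.

    batch_features: (n_samples, n_dims) layer activations, flattened per sample.
    k: number of nearest neighbors used by the estimator (requires k < n_samples).
    """
    n = batch_features.shape[0]
    assert k < n, "need more samples than neighbors"
    eps = 1e-12
    # Pairwise Euclidean distances within the batch: shape (n, n).
    diffs = batch_features[:, None, :] - batch_features[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # k nearest non-self neighbors per sample (drop the zero self-distance).
    knn = np.sort(dists, axis=1)[:, 1:k + 1]
    # Levina-Bickel MLE: LID(x) = -[ (1/k) * sum_i log(r_i / r_k) ]^{-1}.
    log_ratios = np.log((knn + eps) / (knn[:, -1:] + eps))
    lids = -1.0 / np.minimum(log_ratios.mean(axis=1), -eps)
    return float(lids.mean())

def lid_profile(per_layer_features, k: int = 20) -> np.ndarray:
    """Layer-by-layer LID vector characterizing one architecture."""
    return np.array([lid_mle(f, k) for f in per_layer_features])

def lid_similarity(p1: np.ndarray, p2: np.ndarray) -> float:
    """Cosine similarity of two LID profiles (higher = more alike)."""
    return float(p1 @ p2 / (np.linalg.norm(p1) * np.linalg.norm(p2) + 1e-12))

# Toy usage: compare two hypothetical 3-layer subnets on a batch of 128 inputs.
rng = np.random.default_rng(0)
subnet_a = [rng.standard_normal((128, d)) for d in (64, 128, 256)]
subnet_b = [rng.standard_normal((128, d)) for d in (64, 128, 256)]
print(lid_similarity(lid_profile(subnet_a), lid_profile(subnet_b)))
```

Because each LID profile is a vector of one scalar per layer, comparing architectures this way needs only a forward pass per subnet, which is consistent with the memory savings over gradient-based decomposition reported in the abstract.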

Published

2023-06-26

How to Cite

He, X., Yao, J., Wang, Y., Tang, Z., Cheung, K. C., See, S., Han, B., & Chu, X. (2023). NAS-LID: Efficient Neural Architecture Search with Local Intrinsic Dimension. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 7839-7847. https://doi.org/10.1609/aaai.v37i6.25949

Section

AAAI Technical Track on Machine Learning I