TY - JOUR
AU - Cheng, Hsin-Pai
AU - Zhang, Tunhou
AU - Zhang, Yixing
AU - Li, Shiyu
AU - Liang, Feng
AU - Yan, Feng
AU - Li, Meng
AU - Chandra, Vikas
AU - Li, Hai
AU - Chen, Yiran
PY - 2021/05/18
Y2 - 2024/03/28
TI - NASGEM: Neural Architecture Search via Graph Embedding Method
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 35
IS - 8
SE - AAAI Technical Track on Machine Learning I
DO - 10.1609/aaai.v35i8.16872
UR - https://ojs.aaai.org/index.php/AAAI/article/view/16872
SP - 7090
EP - 7098
AB - Neural Architecture Search (NAS) automates and advances the design of neural networks. Estimator-based NAS has recently been proposed to model the relationship between architectures and their performance, enabling scalable and flexible search. However, existing estimator-based methods encode the architecture into a latent space without considering graph similarity. Ignoring graph similarity in a node-based search space may induce a large inconsistency between similar graphs and their distance in the continuous encoding space, leading to an inaccurate encoding representation and/or reduced representation capacity that can yield sub-optimal search results. To preserve graph correlation information in the encoding, we propose NASGEM, which stands for Neural Architecture Search via Graph Embedding Method. NASGEM is driven by a novel graph embedding method equipped with similarity measures to capture graph topology information. By precisely estimating the graph distance and using an auxiliary Weisfeiler-Lehman kernel to guide the encoding, NASGEM can exploit additional structural information to obtain a more accurate graph representation and improve search efficiency. GEMNet, a set of networks discovered by NASGEM, consistently outperforms networks crafted by existing search methods in classification tasks, i.e., with 0.4%-3.6% higher accuracy while having 11%-21% fewer Multiply-Accumulates. We further transfer GEMNet to COCO object detection. In both one-stage and two-stage detectors, our GEMNet surpasses its manually crafted and automatically searched counterparts.
ER -