TY - JOUR
AU - Zhang, Tunhou
AU - Cheng, Hsin-Pai
AU - Li, Zhenwen
AU - Yan, Feng
AU - Huang, Chengyu
AU - Li, Hai
AU - Chen, Yiran
PY - 2020/04/03
Y2 - 2024/03/28
TI - AutoShrink: A Topology-Aware NAS for Discovering Efficient Neural Architecture
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 34
IS - 04
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v34i04.6163
UR - https://ojs.aaai.org/index.php/AAAI/article/view/6163
SP - 6829-6836
AB - Resource is an important constraint when deploying Deep Neural Networks (DNNs) on mobile and edge devices. Existing works commonly adopt the cell-based search approach, which limits the flexibility of network patterns in learned cell structures. Moreover, due to the topology-agnostic nature of existing works, including both cell-based and node-based approaches, the search process is time-consuming and the performance of the found architectures may be sub-optimal. To address these problems, we propose AutoShrink, a topology-aware Neural Architecture Search (NAS) for searching efficient building blocks of neural architectures. Our method is node-based and thus can learn flexible network patterns in cell structures within a topological search space. Directed Acyclic Graphs (DAGs) are used to abstract DNN architectures, and the cell structure is progressively optimized through edge shrinking. Since the search space intrinsically contracts as edges are progressively shrunk, AutoShrink explores a more flexible search space in even less search time. We evaluate AutoShrink on image classification and language tasks by crafting ShrinkCNN and ShrinkRNN models. ShrinkCNN achieves up to a 48% parameter reduction and saves 34% of Multiply-Accumulates (MACs) on ImageNet-1K with accuracy comparable to state-of-the-art (SOTA) models. Notably, both ShrinkCNN and ShrinkRNN are crafted within 1.5 GPU hours, which is 7.2× and 6.7× faster than the crafting time of SOTA CNN and RNN models, respectively.
ER -