ParZC: Parametric Zero-Cost Proxies for Efficient NAS
DOI:
https://doi.org/10.1609/aaai.v39i15.33793
Abstract
Recent advances in zero-shot Neural Architecture Search (NAS) highlight the ability of zero-cost proxies to identify superior architectures. However, we identify a critical issue with current zero-cost proxies: they aggregate node-wise zero-cost statistics without considering that not all nodes in a neural network affect performance estimation equally. Our observations reveal that node-wise zero-cost statistics vary significantly in their contributions to performance, with each node exhibiting a degree of uncertainty. Based on this insight, we introduce the Parametric Zero-Cost Proxies (ParZC) framework, which enhances the adaptability of zero-cost proxies through parameterization. To address this node indiscrimination, we propose a Mixer Architecture with Bayesian Network (MABN) to explore node-wise zero-cost statistics and estimate node-specific uncertainty. Moreover, we propose DiffKendall as a loss function to improve ranking consistency. Comprehensive experiments on NAS-Bench-101, NAS-Bench-201, and NDS demonstrate the superiority of our proposed ParZC over existing zero-shot NAS methods. Additionally, we demonstrate the versatility and adaptability of ParZC on the Vision Transformer search space.
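The abstract names DiffKendall as a ranking-consistency loss but does not spell out its form. A common way to make Kendall's tau differentiable is to replace the hard sign of each pairwise difference with a smooth tanh; the sketch below illustrates that idea only, and the function name and the sharpness parameter `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def soft_kendall(pred, target, alpha=10.0):
    """Differentiable surrogate for Kendall's tau (illustrative sketch).

    For every pair (i, j), the hard sign of the score difference is
    replaced by tanh(alpha * diff), so the agreement between predicted
    and target orderings becomes a smooth function of the predictions.
    Returns a value in roughly [-1, 1]; negate it to use as a loss.
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    n = len(pred)
    # Indices of all unordered pairs i < j.
    iu, ju = np.triu_indices(n, k=1)
    dp = pred[iu] - pred[ju]      # predicted-score differences
    dt = target[iu] - target[ju]  # ground-truth differences
    # Soft concordance: near +1 when a pair is ordered consistently,
    # near -1 when it is ordered inconsistently.
    return float(np.mean(np.tanh(alpha * dp) * np.tanh(alpha * dt)))
```

For perfectly concordant rankings the surrogate approaches +1, and for fully reversed rankings it approaches -1, matching the behavior of the exact Kendall's tau while remaining differentiable in `pred`.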
Published
2025-04-11
How to Cite
Dong, P., Li, L., Tang, Z., Liu, X., Wei, Z., Wang, Q., & Chu, X. (2025). ParZC: Parametric Zero-Cost Proxies for Efficient NAS. Proceedings of the AAAI Conference on Artificial Intelligence, 39(15), 16327–16335. https://doi.org/10.1609/aaai.v39i15.33793
Section
AAAI Technical Track on Machine Learning I