Fast and Controllable Post-training Sparsity: Learning Optimal Sparsity Allocation with Global Constraint in Minutes

Authors

  • Ruihao Gong (State Key Laboratory of Complex & Critical Software Environment, Beihang University; SenseTime Research)
  • Yang Yong (SenseTime Research)
  • Zining Wang (State Key Laboratory of Complex & Critical Software Environment, Beihang University)
  • Jinyang Guo (Institute of Artificial Intelligence, Beihang University; State Key Laboratory of Complex & Critical Software Environment, Beihang University)
  • Xiuying Wei (SenseTime Research)
  • Yuqing Ma (Institute of Artificial Intelligence, Beihang University; State Key Laboratory of Complex & Critical Software Environment, Beihang University)
  • Xianglong Liu (State Key Laboratory of Complex & Critical Software Environment, Beihang University)

DOI

https://doi.org/10.1609/aaai.v38i11.29108

Keywords

ML: Learning on the Edge & Model Compression, CV: Applications

Abstract

Neural network sparsity has attracted much research interest due to its similarity to biological schemes and its high energy efficiency. However, existing methods depend on lengthy training or fine-tuning, which hinders large-scale application. Recently, works on post-training sparsity (PTS) have emerged. They eliminate the high training cost but usually suffer from marked accuracy degradation because they neglect the appropriate sparsity rate for each layer. Previous methods for finding sparsity rates mainly target the training-aware scenario and usually fail to converge stably under the PTS setting, with its limited data and much smaller training budget. In this paper, we propose a fast and controllable post-training sparsity (FCPTS) framework. By incorporating a differentiable bridge function and a controllable optimization objective, our method learns an accurate sparsity allocation in minutes, with the added assurance of convergence to a predetermined global sparsity rate. Equipped with these techniques, we surpass state-of-the-art methods by a large margin, e.g., over 30% improvement for ResNet-50 on ImageNet at a sparsity rate of 80%. Our plug-and-play code and supplementary materials are open-sourced at https://github.com/ModelTC/FCPTS.
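The abstract names two ingredients: a differentiable bridge from learnable parameters to per-layer sparsity rates, and an objective constrained to a predetermined global rate. The sketch below illustrates that general idea in PyTorch; it is not the authors' implementation (see the repository above for that). The sigmoid bridge, the soft pruning surrogate, the quadratic global-rate penalty, and all names (soft_prune_loss, global_target, etc.) are illustrative assumptions.

```python
import torch

# Hypothetical sketch of learning per-layer sparsity rates under a global
# constraint. Not the FCPTS implementation; all names are illustrative.

torch.manual_seed(0)

# Toy "layers": random weight tensors standing in for a pretrained network.
weights = [torch.randn(64, 64), torch.randn(128, 128), torch.randn(256, 256)]
n_params = torch.tensor([w.numel() for w in weights], dtype=torch.float)

# Learnable logits; a sigmoid "bridge" maps them to rates in (0, 1).
logits = torch.zeros(len(weights), requires_grad=True)
opt = torch.optim.Adam([logits], lr=0.05)

global_target = 0.80  # predetermined global sparsity rate

def soft_prune_loss(w, rate):
    # Differentiable surrogate from a sparsity rate to a pruning cost:
    # penalize the mass of the smallest-|w| fraction that would be removed.
    k = int(rate.detach().item() * w.numel())
    if k == 0:
        return w.new_zeros(())
    smallest = torch.topk(w.abs().flatten(), k, largest=False).values
    return rate * smallest.pow(2).sum() / w.numel()

for step in range(200):
    rates = torch.sigmoid(logits)
    recon = sum(soft_prune_loss(w, r) for w, r in zip(weights, rates))
    # Penalty pinning the parameter-weighted average rate to the target.
    global_rate = (rates * n_params).sum() / n_params.sum()
    loss = recon + 10.0 * (global_rate - global_target) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()

print("learned per-layer rates:", torch.sigmoid(logits).tolist())
```

Under this toy objective, layers whose small-magnitude weights carry less mass drift toward higher rates while the penalty keeps the overall average at 80%, mirroring the controllable global constraint the abstract describes.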

Published

2024-03-24

How to Cite

Gong, R., Yong, Y., Wang, Z., Guo, J., Wei, X., Ma, Y., & Liu, X. (2024). Fast and Controllable Post-training Sparsity: Learning Optimal Sparsity Allocation with Global Constraint in Minutes. Proceedings of the AAAI Conference on Artificial Intelligence, 38(11), 12190-12198. https://doi.org/10.1609/aaai.v38i11.29108

Section

AAAI Technical Track on Machine Learning II