TY - JOUR
AU - Tang, Yehui
AU - You, Shan
AU - Xu, Chang
AU - Han, Jin
AU - Qian, Chen
AU - Shi, Boxin
AU - Xu, Chao
AU - Zhang, Changshui
PY - 2020/04/03
Y2 - 2024/03/29
TI - Reborn Filters: Pruning Convolutional Neural Networks with Limited Data
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 34
IS - 04
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v34i04.6058
UR - https://ojs.aaai.org/index.php/AAAI/article/view/6058
SP - 5972-5980
AB - Channel pruning is effective in compressing pretrained CNNs for deployment on low-end edge devices. Most existing methods independently prune some of the original channels and need the complete original dataset to recover from the performance drop after pruning. However, due to commercial protection or data privacy, users may only have access to a tiny portion of the training examples, which can be insufficient for performance recovery. In this paper, for pruning with limited data, we propose to use all original filters to directly develop new compact filters, named reborn filters, so that all useful structure priors in the original filters are well preserved in the pruned network, alleviating the performance drop accordingly. During training, reborn filters can be easily implemented via 1×1 convolutional layers and then fused in the inference stage for acceleration. Based on reborn filters, the proposed channel pruning algorithm demonstrates its effectiveness and superiority in extensive experiments.
ER -
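
The abstract describes building compact reborn filters from all original filters via 1×1 convolutions during training, then fusing them at inference. Below is a minimal PyTorch sketch of that idea under stated assumptions, not the authors' released code: the class name RebornConv, the out_channels_kept parameter, and all layer sizes are illustrative.

import torch
import torch.nn as nn


class RebornConv(nn.Module):
    """Original KxK conv followed by a learnable 1x1 conv that mixes its outputs."""

    def __init__(self, original_conv: nn.Conv2d, out_channels_kept: int):
        super().__init__()
        self.original = original_conv  # pretrained KxK convolution, kept intact
        # 1x1 convolution: each new channel is a linear combination of all original channels
        self.mix = nn.Conv2d(original_conv.out_channels, out_channels_kept,
                             kernel_size=1, bias=True)

    def forward(self, x):
        return self.mix(self.original(x))

    def fuse(self) -> nn.Conv2d:
        """Collapse the two convolutions into one compact KxK conv for inference.

        Valid because no nonlinearity sits between them: composing two linear
        maps gives a single linear map with fewer output channels.
        """
        w = self.original.weight.data                 # (C_out, C_in, k, k)
        a = self.mix.weight.data.flatten(1)           # (C_new, C_out)
        fused_w = torch.einsum('no,ochw->nchw', a, w) # (C_new, C_in, k, k)
        fused_b = self.mix.bias.data.clone()
        if self.original.bias is not None:
            fused_b += a @ self.original.bias.data

        fused = nn.Conv2d(self.original.in_channels, a.size(0),
                          kernel_size=self.original.kernel_size,
                          stride=self.original.stride,
                          padding=self.original.padding,
                          bias=True)
        fused.weight.data.copy_(fused_w)
        fused.bias.data.copy_(fused_b)
        return fused


# Sanity check: the fused compact conv reproduces the two-stage training-time output.
orig = nn.Conv2d(64, 128, kernel_size=3, padding=1)
layer = RebornConv(orig, out_channels_kept=96)
x = torch.randn(1, 64, 32, 32)
assert torch.allclose(layer(x), layer.fuse()(x), atol=1e-4)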