Channel Regeneration: Improving Channel Utilization for Compact DNNs

Authors

  • Ankit Sharma, University of Central Florida
  • Hassan Foroosh, University of Central Florida

DOI:

https://doi.org/10.1609/aaai.v37i2.25316

Keywords:

CV: Applications, CV: Object Detection & Categorization, CV: Other Foundations of Computer Vision

Abstract

Overparameterized deep neural networks contain redundant neurons that do not contribute to the network's accuracy. In this paper, we introduce a novel channel regeneration technique that reinvigorates these redundant channels by re-initializing their batch normalization scaling factor gamma. This re-initialization of BN gamma promotes regular weight updates during training. Furthermore, we show that channel regeneration encourages the channels to contribute equally to the learned representation and further boosts generalization accuracy. We apply our technique at regular intervals of the training cycle to improve channel utilization. Unlike solutions proposed in previous works, which either raise the total computational cost or increase model complexity, integrating the proposed channel regeneration technique into the training methodology of efficient architectures requires minimal effort and comes at no additional cost in size or memory. Extensive experiments on several image classification and semantic segmentation benchmarks demonstrate the effectiveness of applying the channel regeneration technique to compact architectures.
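To make the idea concrete, below is a minimal PyTorch sketch of the general mechanism the abstract describes: at regular intervals during training, re-initialize the BN scaling factors (gamma) of channels that appear redundant. The redundancy criterion (`gamma_threshold`), the reset value, and the names `regenerate_channels` and `regen_interval` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def regenerate_channels(model: nn.Module,
                        gamma_threshold: float = 0.01,
                        init_value: float = 1.0) -> None:
    """Re-initialize BN gamma for under-utilized channels.

    Channels whose |gamma| has collapsed below `gamma_threshold` are
    treated as redundant; their gamma is reset to `init_value` so they
    resume receiving meaningful weight updates. Threshold and reset
    value are illustrative choices, not the paper's criterion.
    """
    with torch.no_grad():
        for module in model.modules():
            if isinstance(module, nn.BatchNorm2d):
                redundant = module.weight.abs() < gamma_threshold
                module.weight[redundant] = init_value
                module.bias[redundant] = 0.0  # optionally reset the shift (beta) as well

# Hypothetical training loop: apply regeneration at regular intervals.
# for epoch in range(num_epochs):
#     train_one_epoch(model, loader, optimizer)
#     if (epoch + 1) % regen_interval == 0:
#         regenerate_channels(model)
```

Because the reset touches only existing BN parameters, this style of intervention adds no parameters or memory, consistent with the abstract's claim of zero additional cost in size or memory.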

Published

2023-06-26

How to Cite

Sharma, A., & Foroosh, H. (2023). Channel Regeneration: Improving Channel Utilization for Compact DNNs. Proceedings of the AAAI Conference on Artificial Intelligence, 37(2), 2218-2226. https://doi.org/10.1609/aaai.v37i2.25316

Section

AAAI Technical Track on Computer Vision II