Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition

Authors

  • Marawan Gamal Abdel Hameed, Noah’s Ark Lab, Huawei Technologies Canada; University of Waterloo
  • Marzieh S. Tahaei, Noah’s Ark Lab, Huawei Technologies Canada
  • Ali Mosleh, Noah’s Ark Lab, Huawei Technologies Canada
  • Vahid Partovi Nia, Noah’s Ark Lab, Huawei Technologies Canada

DOI:

https://doi.org/10.1609/aaai.v36i1.19958

Keywords:

Computer Vision (CV)

Abstract

Modern Convolutional Neural Network (CNN) architectures, despite their superiority in solving various problems, are generally too large to be deployed on resource-constrained edge devices. In this paper, we reduce the memory usage and floating-point operations required by convolutional layers in CNNs. We compress these layers by generalizing the Kronecker Product Decomposition to apply to multidimensional tensors, leading to the Generalized Kronecker Product Decomposition (GKPD). Our approach yields a plug-and-play module that can be used as a drop-in replacement for any convolutional layer. Experimental results for image classification on the CIFAR-10 and ImageNet datasets using ResNet, MobileNetV2 and SENet architectures substantiate the effectiveness of our proposed approach. We find that GKPD outperforms state-of-the-art decomposition methods, including Tensor-Train and Tensor-Ring, as well as other relevant compression methods such as pruning and knowledge distillation.
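To make the idea of a Kronecker-factorized convolution concrete, here is a minimal PyTorch sketch of a drop-in layer whose kernel is a sum of Kronecker products of two smaller tensors. The module name KronConv2d, the helper kron4d, the factor shapes, and the rank are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def kron4d(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Generalized Kronecker product of two 4-D tensors:
    # a: (o1, i1, h1, w1), b: (o2, i2, h2, w2) -> (o1*o2, i1*i2, h1*h2, w1*w2)
    o1, i1, h1, w1 = a.shape
    o2, i2, h2, w2 = b.shape
    out = torch.einsum("abcd,efgh->aebfcgdh", a, b)
    return out.reshape(o1 * o2, i1 * i2, h1 * h2, w1 * w2)


class KronConv2d(nn.Module):
    # Convolution whose kernel is a sum of `rank` Kronecker products of two
    # smaller factor tensors; only the factors are stored and trained.
    def __init__(self, a_shape, b_shape, rank=1, stride=1, padding=0):
        super().__init__()
        self.stride, self.padding = stride, padding
        self.a = nn.Parameter(torch.randn(rank, *a_shape) * 0.1)
        self.b = nn.Parameter(torch.randn(rank, *b_shape) * 0.1)

    def forward(self, x):
        # Rebuild the full kernel from the factors, then run a standard conv.
        weight = sum(kron4d(self.a[r], self.b[r]) for r in range(self.a.shape[0]))
        return F.conv2d(x, weight, stride=self.stride, padding=self.padding)


# Example: a 64 -> 128 channel 3x3 convolution whose dense (128, 64, 3, 3)
# kernel is approximated by rank-2 factors of shapes (8, 8, 3, 1) and (16, 8, 1, 3).
layer = KronConv2d(a_shape=(8, 8, 3, 1), b_shape=(16, 8, 1, 3), rank=2, padding=1)
y = layer(torch.randn(1, 64, 32, 32))
print(y.shape)  # torch.Size([1, 128, 32, 32])
```

In this illustrative configuration the factors hold 1,152 parameters versus 73,728 for the dense (128, 64, 3, 3) kernel. The abstract also reports reductions in floating-point operations, which would require computing the convolution directly from the factors rather than reconstructing the full kernel as this sketch does.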

Published

2022-06-28

How to Cite

Hameed, M. G. A., Tahaei, M. S., Mosleh, A., & Partovi Nia, V. (2022). Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition. Proceedings of the AAAI Conference on Artificial Intelligence, 36(1), 771-779. https://doi.org/10.1609/aaai.v36i1.19958

Issue

Vol. 36 No. 1 (2022)

Section

AAAI Technical Track on Computer Vision I