Decoupled Convolutions for CNNs

Authors

  • Guotian Xie, Sun Yat-Sen University
  • Ting Zhang, Microsoft Research
  • Kuiyuan Yang, DeepMotion
  • Jianhuang Lai, Sun Yat-Sen University
  • Jingdong Wang, Microsoft Research

DOI:

https://doi.org/10.1609/aaai.v32i1.11638

Keywords:

decomposing filter, decoupled convolution, balance decoupling spatial convolution, spatial configuration

Abstract

In this paper, we are interested in designing small CNNs by decoupling the convolution along the spatial and channel domains. Most existing decoupling techniques focus on approximating the filter matrix through decomposition. In contrast, we provide a two-step interpretation of the standard convolution, from the filter at a single location to the filters at all locations, which is exactly equivalent to the standard convolution. Motivated by the observations in our decoupling view, we propose an effective approach that relaxes the sparsity of the filter in spatial aggregation by learning a spatial configuration, and reduces redundancy by reducing the number of intermediate channels. Our approach achieves classification performance comparable to the standard uncoupled convolution, but with a smaller model size, on CIFAR-100, CIFAR-10, and ImageNet.
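The parameter saving from this kind of decoupling can be illustrated with a generic sketch. Note this is not the authors' exact formulation (which learns a spatial configuration and intermediate channel count); it is a minimal depthwise-then-pointwise decomposition, assuming "valid" padding, stride 1, and a NumPy array layout of (channels, height, width):

```python
import numpy as np

def depthwise_conv(x, w):
    """Spatial aggregation: one k x k filter per input channel.
    x: (C, H, W), w: (C, k, k). Returns (C, H-k+1, W-k+1)."""
    C, H, W = x.shape
    k = w.shape[1]
    out = np.zeros((C, H - k + 1, W - k + 1))
    for c in range(C):
        for i in range(H - k + 1):
            for j in range(W - k + 1):
                out[c, i, j] = np.sum(x[c, i:i + k, j:j + k] * w[c])
    return out

def pointwise_conv(x, w):
    """Channel combination: a 1 x 1 convolution mixing channels.
    x: (C_in, H, W), w: (C_out, C_in). Returns (C_out, H, W)."""
    return np.tensordot(w, x, axes=([1], [0]))

def decoupled_conv(x, w_spatial, w_channel):
    """Two-step decoupled convolution: spatial aggregation per channel,
    followed by cross-channel mixing."""
    return pointwise_conv(depthwise_conv(x, w_spatial), w_channel)

# Parameter-count comparison for C_in = C_out = 64 and a 3 x 3 kernel:
C_in, C_out, k = 64, 64, 3
standard_params = C_in * C_out * k * k           # 36864 weights
decoupled_params = C_in * k * k + C_out * C_in   # 576 + 4096 = 4672 weights
```

With these (illustrative) settings the decoupled form uses roughly 8x fewer weights than the standard convolution; reducing the number of intermediate channels, as the paper proposes, shrinks the second term further.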

Published

2018-04-29

How to Cite

Xie, G., Zhang, T., Yang, K., Lai, J., & Wang, J. (2018). Decoupled Convolutions for CNNs. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11638