Group-Wise Dynamic Dropout Based on Latent Semantic Variations

Authors

  • Zhiwei Ke, Shenzhen University
  • Zhiwei Wen, Shenzhen University
  • Weicheng Xie, Shenzhen University
  • Yi Wang, Dongguan University of Technology
  • Linlin Shen, Shenzhen University

DOI:

https://doi.org/10.1609/aaai.v34i07.6782

Abstract

Dropout regularization has been widely used in deep neural networks to combat overfitting. It works by training a network to be more robust to information-degraded data points, which improves generalization. Conventional dropout and its variants are often applied to individual hidden units in a layer to break up co-adaptations of feature detectors. In this paper, we propose an adaptive dropout that reduces co-adaptations in a group-wise manner using coarse semantic information, thereby improving feature discriminability. In particular, we show that adjusting the dropout probability based on local feature densities not only improves classification performance significantly but also enhances network robustness against adversarial examples in some cases. The proposed approach is evaluated against the baseline and several state-of-the-art adaptive dropouts on four public datasets: Fashion-MNIST, CIFAR-10, CIFAR-100, and SVHN.
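To make the abstract's idea concrete, below is a minimal PyTorch sketch of group-wise dropout whose per-group drop probability is driven by an activation statistic. It is not the authors' exact formulation: the class name, the density proxy (batch variance of each channel group), and the linear mapping from that proxy to a probability in [base_p, max_p] are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class GroupWiseDynamicDropout(nn.Module):
    """Illustrative sketch (not the paper's exact method): channels are
    split into groups, and each group's dropout probability is scaled by
    a crude density proxy (the variance of that group's activations)."""

    def __init__(self, num_groups=4, base_p=0.2, max_p=0.5):
        super().__init__()
        self.num_groups = num_groups
        self.base_p = base_p
        self.max_p = max_p

    def forward(self, x):  # x: (N, C, H, W), C divisible by num_groups
        if not self.training:
            return x
        n, c, h, w = x.shape
        g = self.num_groups
        xg = x.view(n, g, c // g, h, w)

        # Per-group activation variance over batch and spatial dims,
        # used here as a stand-in for "local feature density".
        var = xg.transpose(0, 1).reshape(g, -1).var(dim=1)   # (g,)
        score = var / (var.sum() + 1e-8)                     # normalize

        # Map the density proxy to a per-group dropout probability.
        p = self.base_p + (self.max_p - self.base_p) * score  # (g,)

        # Sample a Bernoulli keep-mask per (sample, group) and rescale
        # surviving groups, as in standard inverted dropout.
        keep = 1.0 - p
        mask = torch.bernoulli(keep.expand(n, g)) / keep      # (n, g)
        xg = xg * mask.view(n, g, 1, 1, 1)
        return xg.view(n, c, h, w)
```

In use, such a layer would simply replace a standard nn.Dropout2d inside a convolutional block, e.g. `GroupWiseDynamicDropout(num_groups=4)(features)`; dropping whole channel groups rather than individual units is what targets co-adaptation at the group level rather than the unit level.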

Published

2020-04-03

How to Cite

Ke, Z., Wen, Z., Xie, W., Wang, Y., & Shen, L. (2020). Group-Wise Dynamic Dropout Based on Latent Semantic Variations. Proceedings of the AAAI Conference on Artificial Intelligence, 34(07), 11229-11236. https://doi.org/10.1609/aaai.v34i07.6782

Section

AAAI Technical Track: Vision