Group-Wise Dynamic Dropout Based on Latent Semantic Variations


  • Zhiwei Ke, Shenzhen University
  • Zhiwei Wen, Shenzhen University
  • Weicheng Xie, Shenzhen University
  • Yi Wang, Dongguan University of Technology
  • Linlin Shen, Shenzhen University



Dropout regularization has been widely used in deep neural networks to combat overfitting. It works by training a network to be robust to information-degraded inputs, which improves generalization. Conventional dropout and its variants are typically applied to individual hidden units in a layer to break up co-adaptations among feature detectors. In this paper, we propose an adaptive dropout that reduces co-adaptations in a group-wise manner, guided by coarse semantic information, to improve feature discriminability. In particular, we show that adjusting the dropout probability based on local feature densities not only improves classification performance significantly but also, in some cases, enhances network robustness against adversarial examples. The proposed approach is evaluated against the baseline and several state-of-the-art adaptive dropout variants on four public datasets: Fashion-MNIST, CIFAR-10, CIFAR-100, and SVHN.
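The core idea described above (dropping feature groups with a probability scaled by local feature density) can be sketched as follows. This is an illustrative NumPy approximation, not the authors' exact method: the grouping of feature dimensions is assumed to be given (e.g. from clustering latent semantics), and "density" is approximated here by each group's mean activation magnitude relative to the layer mean.

```python
import numpy as np

def group_dropout(features, groups, base_rate=0.5, rng=None):
    """Sketch of group-wise dynamic dropout (hypothetical simplification).

    features: (batch, dim) activations of one layer
    groups: list of index arrays partitioning the feature axis into
            semantic groups (assumed to be precomputed)
    base_rate: nominal dropout probability before density scaling
    """
    rng = np.random.default_rng() if rng is None else rng
    out = features.astype(float).copy()
    layer_mean = np.abs(features).mean() + 1e-8
    for idx in groups:
        # Proxy for local feature density: relative activation magnitude.
        density = np.abs(features[:, idx]).mean() / layer_mean
        # Scale the drop probability by density; clip to a sane range.
        p = float(np.clip(base_rate * density, 0.0, 0.9))
        # Drop whole units of the group together (shared across the batch).
        keep = rng.random(len(idx)) >= p
        # Inverted-dropout scaling keeps the expected activation unchanged.
        out[:, idx] *= keep / max(1.0 - p, 1e-8)
    return out
```

With `base_rate=0.0` the function reduces to the identity, which makes the scaling behavior easy to sanity-check; at training time, groups with higher activation density would be dropped more aggressively under this sketch.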




How to Cite

Ke, Z., Wen, Z., Xie, W., Wang, Y., & Shen, L. (2020). Group-Wise Dynamic Dropout Based on Latent Semantic Variations. Proceedings of the AAAI Conference on Artificial Intelligence, 34(07), 11229–11236.



AAAI Technical Track: Vision