TY  - JOUR
AU  - Tang, Yehui
AU  - Wang, Yunhe
AU  - Xu, Yixing
AU  - Shi, Boxin
AU  - Xu, Chao
AU  - Xu, Chunjing
AU  - Xu, Chang
PY  - 2020/04/03
Y2  - 2024/03/28
TI  - Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 34
IS  - 04
SE  - AAAI Technical Track: Machine Learning
DO  - 10.1609/aaai.v34i04.6057
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/6057
SP  - 5964
EP  - 5971
AB  - Deep neural networks often consist of a great number of trainable parameters for extracting powerful features from given datasets. On the one hand, massive trainable parameters significantly enhance the performance of these deep networks. On the other hand, they bring the problem of over-fitting. To this end, dropout-based methods disable some elements in the output feature maps during the training phase to reduce the co-adaptation of neurons. Although these approaches can enhance the generalization ability of the resulting models, conventional binary dropout is not the optimal solution. We therefore investigate the empirical Rademacher complexity related to intermediate layers of deep neural networks and propose a feature-distortion method to address this problem. During training, randomly selected elements in the feature maps are replaced with specific values obtained by exploiting the generalization error bound. The superiority of the proposed feature map distortion for producing deep neural networks with higher testing performance is analyzed and demonstrated on several benchmark image datasets.
ER  - 