Learning from Noisy Labels with Complementary Loss Functions

Authors

  • Deng-Bao Wang, Southeast University; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education
  • Yong Wen, Noah’s Ark Lab, Huawei Technologies
  • Lujia Pan, Noah’s Ark Lab, Huawei Technologies; NSKEYLAB, Xi’an Jiaotong University
  • Min-Ling Zhang, Southeast University; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education; Collaborative Innovation Center of Wireless Communications Technology

DOI:

https://doi.org/10.1609/aaai.v35i11.17213

Keywords:

Classification and Regression, (Deep) Neural Network Algorithms

Abstract

Recent research reveals that deep neural networks are sensitive to label noise, which leads to poor generalization performance on some tasks. Although various robust loss functions have been proposed to remedy this issue, they suffer from an underfitting problem and are thus insufficient for learning accurate models. On the other hand, the commonly used Cross Entropy (CE) loss, which performs well in standard supervised learning (with clean supervision), is not robust to label noise. In this paper, we propose a general framework for learning robust deep neural networks with complementary loss functions. In our framework, CE and a robust loss play complementary roles in a joint learning objective, according to their learning sufficiency and robustness properties respectively. Specifically, we find that by exploiting the memorization effect of neural networks, we can easily filter out a proportion of hard samples and generate reliable pseudo labels for easy samples, thereby reducing the label noise to a very low level. Then, we simply learn with CE on the pseudo supervision and with the robust loss on the original noisy supervision. In this procedure, CE guarantees the sufficiency of optimization while the robust loss serves as a supplement. Experimental results on benchmark classification datasets indicate that the proposed method achieves robust and sufficient deep neural network training simultaneously.
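
The abstract describes a joint objective in which CE on pseudo labels drives sufficient optimization while a robust loss on the original noisy labels guards against noise. Below is a minimal PyTorch sketch of such a combined objective; the choice of MAE as the robust loss and the weighting factor `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def complementary_loss(logits, pseudo_labels, noisy_labels, lam=1.0):
    """Joint objective: CE on pseudo labels (learning sufficiency)
    plus a robust loss on the original noisy labels (robustness).

    Assumptions: MAE as the robust loss and the weight `lam` are
    illustrative choices, not necessarily the paper's formulation.
    """
    # Cross entropy on the (mostly clean) pseudo labels ensures
    # sufficient optimization.
    ce = F.cross_entropy(logits, pseudo_labels)

    # MAE between softmax outputs and one-hot noisy labels is a
    # standard noise-robust loss; it supplements the CE term.
    probs = F.softmax(logits, dim=1)
    one_hot = F.one_hot(noisy_labels, num_classes=logits.size(1)).float()
    mae = (probs - one_hot).abs().sum(dim=1).mean()

    return ce + lam * mae

# Toy usage with random logits and labels.
logits = torch.randn(8, 10, requires_grad=True)
pseudo = torch.randint(0, 10, (8,))   # pseudo labels for easy samples
noisy = torch.randint(0, 10, (8,))    # original noisy labels
loss = complementary_loss(logits, pseudo, noisy)
loss.backward()
```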

Published

2021-05-18

How to Cite

Wang, D.-B., Wen, Y., Pan, L., & Zhang, M.-L. (2021). Learning from Noisy Labels with Complementary Loss Functions. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 10111-10119. https://doi.org/10.1609/aaai.v35i11.17213

Issue

Vol. 35 No. 11 (2021)

Section

AAAI Technical Track on Machine Learning IV