Learning with Noisy Labels Using Hyperspherical Margin Weighting

Authors

  • Shuo Zhang, Southeast University
  • Yuwen Li, Southeast University
  • Zhongyu Wang, Southeast University
  • Jianqing Li, Southeast University
  • Chengyu Liu, Southeast University

DOI:

https://doi.org/10.1609/aaai.v38i15.29626

Keywords:

ML: Multi-class/Multi-label Learning & Extreme Classification, ML: Adversarial Learning & Robustness

Abstract

Datasets often contain noisy labels, and learning from them is difficult. Because mislabeled examples usually incur larger loss values during training, the small-loss trick is widely used as a standard criterion for identifying clean examples in the training set. However, this criterion overlooks the fact that some clean but hard-to-learn examples also produce large losses and may therefore be misidentified. In this paper, we propose a new metric, the Integrated Area Margin (IAM), which is superior to the traditional small-loss trick, particularly in recognizing clean but hard-to-learn examples. Building on the IAM, we further present the Hyperspherical Margin Weighting (HMW) approach, a new sample weighting strategy that reassigns the importance of each example. Notably, our approach is universal and can strengthen various methods in this field. Experiments on both benchmark and real-world datasets show that HMW outperforms many state-of-the-art approaches on learning-with-noisy-labels tasks. Code is available at https://github.com/Zhangshuojackpot/HMW.
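The abstract contrasts the small-loss trick with margin-based sample weighting but does not reproduce the IAM/HMW definitions here; for those, see the paper and the linked repository. As a rough point of reference only, the sketch below illustrates (a) the standard small-loss selection the abstract criticizes and (b) a generic margin-based per-example weighting. The function names, the `clean_fraction` and `temperature` parameters, and the sigmoid squashing are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def small_loss_mask(losses, clean_fraction=0.7):
    """Standard small-loss trick: treat the clean_fraction of samples with
    the smallest per-example loss as (likely) clean and keep only those.
    Clean but hard-to-learn samples with large losses get discarded."""
    k = max(1, int(len(losses) * clean_fraction))
    threshold = np.sort(losses)[k - 1]
    return losses <= threshold

def margin_weights(probs, labels, temperature=1.0):
    """Toy margin-based weighting (NOT the paper's IAM/HMW): weight each
    example by the gap between the predicted probability of its given label
    and the largest competing class probability, squashed into (0, 1)."""
    n = len(labels)
    p_label = probs[np.arange(n), labels]        # prob. of the given label
    rivals = probs.copy()
    rivals[np.arange(n), labels] = -np.inf       # mask out the given label
    p_rival = rivals.max(axis=1)                 # strongest competing class
    margin = p_label - p_rival                   # in (-1, 1)
    return 1.0 / (1.0 + np.exp(-margin / temperature))

# Example: three samples, three classes.
probs = np.array([[0.8, 0.1, 0.1],   # confidently matches its label
                  [0.4, 0.35, 0.25], # hard but still consistent
                  [0.1, 0.8, 0.1]])  # prediction contradicts its label
labels = np.array([0, 0, 0])
print(margin_weights(probs, labels))  # high, medium, low weights
```

In this toy version, a hard-but-clean sample (small positive margin) keeps a moderate weight instead of being dropped outright, which is the qualitative behavior the abstract argues for; the actual IAM integrates margin information on the hypersphere as described in the paper.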

Published

2024-03-24

How to Cite

Zhang, S., Li, Y., Wang, Z., Li, J., & Liu, C. (2024). Learning with Noisy Labels Using Hyperspherical Margin Weighting. Proceedings of the AAAI Conference on Artificial Intelligence, 38(15), 16848-16856. https://doi.org/10.1609/aaai.v38i15.29626

Issue

Vol. 38 No. 15 (2024)

Section

AAAI Technical Track on Machine Learning VI