Learning with Open-world Noisy Data via Class-independent Margin in Dual Representation Space

Authors

  • Linchao Pan, College of Computer Science and Software Engineering, Shenzhen University
  • Can Gao, College of Computer Science and Software Engineering, Shenzhen University; Guangdong Provincial Key Laboratory of Intelligent Information Processing
  • Jie Zhou, Guangdong Provincial Key Laboratory of Intelligent Information Processing; National Engineering Laboratory for Big Data System Computing Technology, Shenzhen University
  • Jinbao Wang, Guangdong Provincial Key Laboratory of Intelligent Information Processing; National Engineering Laboratory for Big Data System Computing Technology, Shenzhen University

DOI:

https://doi.org/10.1609/aaai.v39i6.32673

Abstract

Learning with Noisy Labels (LNL) aims to improve model generalization when training on data with noisy labels, and existing methods generally assume that noisy labels come from known classes, called closed-set noise. However, in real-world scenarios, noisy labels from similar unknown classes, i.e., open-set noise, may occur during both the training and inference stages. Such open-world noisy labels can significantly degrade the performance of LNL methods. In this study, we propose a novel dual-space joint learning method to robustly handle open-world noise. To mitigate model overfitting on closed-set and open-set noise, a dual representation space is constructed by two networks. One is a projection network that learns shared representations in the prototype space, while the other is a One-Vs-All (OVA) network that makes predictions using unique semantic representations in the class-independent space. Then, bi-level contrastive learning and consistency regularization are introduced in the two spaces to enhance the detection capability for data from unknown classes. To benefit from the memorization effects across different types of samples, class-independent margin criteria are designed for sample identification, which select clean samples, weight closed-set noise, and filter open-set noise effectively. Extensive experiments demonstrate that our method outperforms state-of-the-art methods, achieving an average accuracy improvement of 4.55% and an AUROC improvement of 6.17% on CIFAR80N.

Published

2025-04-11

How to Cite

Pan, L., Gao, C., Zhou, J., & Wang, J. (2025). Learning with Open-world Noisy Data via Class-independent Margin in Dual Representation Space. Proceedings of the AAAI Conference on Artificial Intelligence, 39(6), 6290-6298. https://doi.org/10.1609/aaai.v39i6.32673

Section

AAAI Technical Track on Computer Vision V