Limited-Supervised Multi-Label Learning with Dependency Noise

Authors

  • Yejiang Wang Northeastern University, China
  • Yuhai Zhao Northeastern University, China
  • Zhengkui Wang Singapore Institute of Technology
  • Wen Shan Singapore University of Social Sciences
  • Xingwei Wang Northeastern University, China

DOI:

https://doi.org/10.1609/aaai.v38i14.29494

Keywords:

ML: Multi-class/Multi-label Learning & Extreme Classification, ML: Multi-instance/Multi-view Learning, ML: Optimization

Abstract

Limited-supervised multi-label learning (LML) leverages weak or noisy supervision to train multi-label classification models on data with label noise, which may contain missing labels and/or redundant labels. Existing studies usually solve LML problems by assuming that label noise is independent of the input features and class labels, ignoring the fact that in many real-world applications noisy labels may depend on the input features (instance-dependent) and on the classes (label-dependent). In this paper, we propose limited-supervised Multi-label Learning with Dependency Noise (MLDN) to simultaneously identify instance-dependent and label-dependent label noise by factorizing the noise matrix as the output of a mapping from the feature and label representations. Meanwhile, we regularize the problem with a manifold constraint on the noise matrix to preserve local relationships and uncover the manifold structure. Theoretically, we bound the noise recovery error for the resulting problem. We solve the problem with a first-order scheme based on the proximal operator, whose convergence rate is at least sub-linear. Extensive experiments conducted on various datasets demonstrate the superiority of our proposed method.
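The abstract's "first-order scheme based on the proximal operator" refers to the general proximal-gradient template: alternate a gradient step on the smooth loss with a proximal step on the non-smooth regularizer. As a hedged illustration only (not the authors' actual MLDN objective), the sketch below applies ISTA to a toy problem of recovering a sparse noise matrix `E` from an observation `Y`, minimizing `0.5*||E - Y||_F^2 + lam*||E||_1`; the names `lam`, `step`, and the L1 penalty are assumptions made for the example.

```python
import numpy as np

def soft_threshold(Z, tau):
    # Proximal operator of tau * ||.||_1, applied entrywise.
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def proximal_gradient(Y, lam=0.1, step=0.5, n_iter=200):
    """ISTA for: min_E 0.5*||E - Y||_F^2 + lam*||E||_1 (toy stand-in
    for a noise-matrix recovery objective; not the MLDN formulation)."""
    E = np.zeros_like(Y)
    for _ in range(n_iter):
        grad = E - Y                                # gradient of the smooth term
        E = soft_threshold(E - step * grad, step * lam)  # proximal step
    return E
```

For this simple objective the minimizer is the soft-thresholded observation itself, so the iterates converge to `soft_threshold(Y, lam)`; in the paper's setting the smooth term would instead couple the noise matrix with the feature/label mapping and the manifold regularizer.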

Published

2024-03-24

How to Cite

Wang, Y., Zhao, Y., Wang, Z., Shan, W., & Wang, X. (2024). Limited-Supervised Multi-Label Learning with Dependency Noise. Proceedings of the AAAI Conference on Artificial Intelligence, 38(14), 15662-15670. https://doi.org/10.1609/aaai.v38i14.29494

Section

AAAI Technical Track on Machine Learning V