Noisy Node Classification by Bi-level Optimization Based Multi-Teacher Distillation

Authors

  • Yujing Liu, Guangxi Normal University
  • Zongqian Wu, University of Electronic Science and Technology of China
  • Zhengyu Lu, Guangxi Normal University; Anyang Institute of Technology
  • Ci Nie, Guangxi Normal University
  • Guoqiu Wen, Guangxi Normal University
  • Yonghua Zhu, Guangxi Normal University; Singapore University of Technology and Design
  • Xiaofeng Zhu, Guangxi Normal University; University of Electronic Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v39i18.34095

Abstract

Previous graph neural networks (GNNs) usually assume that graph data comes with clean labels for representation learning, but this assumption rarely holds in real applications. In this paper, we propose a new multi-teacher distillation method based on bi-level optimization (namely BO-NNC) to conduct noisy node classification on graph data. Specifically, we first employ multiple self-supervised learning methods to train diverse teacher models, and then aggregate their predictions through a teacher weight matrix. Furthermore, we design a new bi-level optimization strategy that dynamically adjusts the teacher weight matrix based on the training progress of the student model. Finally, we design a label improvement module to improve label quality. Extensive experimental results on real datasets show that our method achieves the best results compared to state-of-the-art methods.

Published

2025-04-11

How to Cite

Liu, Y., Wu, Z., Lu, Z., Nie, C., Wen, G., Zhu, Y., & Zhu, X. (2025). Noisy Node Classification by Bi-level Optimization Based Multi-Teacher Distillation. Proceedings of the AAAI Conference on Artificial Intelligence, 39(18), 19033–19040. https://doi.org/10.1609/aaai.v39i18.34095

Section

AAAI Technical Track on Machine Learning IV