Curriculum-Enhanced Residual Soft An-Isotropic Normalization for Over-Smoothness in Deep GNNs

Authors

  • Jin Li, College of Computer and Data Science, Fuzhou University; AI Thrust, Information Hub, HKUST (Guangzhou)
  • Qirong Zhang, College of Computer and Data Science, Fuzhou University
  • Shuling Xu, College of Computer and Data Science, Fuzhou University
  • Xinlong Chen, College of Computer and Data Science, Fuzhou University
  • Longkun Guo, College of Computer and Data Science, Fuzhou University; Shandong Fundamental Research Center for Computer Science
  • Yang-Geng Fu, College of Computer and Data Science, Fuzhou University

DOI:

https://doi.org/10.1609/aaai.v38i12.29256

Keywords:

ML: Graph-based Machine Learning, DMKM: Graph Mining, Social Network Analysis & Community, ML: Deep Learning Algorithms, ML: Semi-Supervised Learning

Abstract

Despite graph neural networks' (GNNs) significant performance gains over many classic techniques on various graph-related downstream tasks, their success is largely restricted to shallow models, due to over-smoothing and optimization difficulties, among other issues. In this paper, to alleviate the over-smoothing issue, we propose a soft graph normalization method that preserves the diversity of node embeddings and prevents them from becoming indistinguishable through excessive closeness. We further analyze why, combined with residual connections, the method can effectively capture the knowledge in both input graph structures and node features even in deep networks. Additionally, inspired by Curriculum Learning, which learns easy examples before hard ones, we propose a novel label-smoothing-based learning framework to enhance the optimization of deep GNNs: it iteratively smooths labels in an auxiliary graph and constructs a sequence of progressively less-smooth tasks, extracting increasingly complex knowledge and discriminating nodes from coarse to fine. The framework arguably reduces the risk of overfitting and yields better generalization. Finally, extensive experiments demonstrate the effectiveness and potential of the proposed model and learning framework through comparison with twelve existing baselines, including state-of-the-art methods, on twelve real-world node classification benchmarks.
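To make the normalization idea concrete, below is a minimal NumPy sketch of one plausible instantiation of "soft" node-wise normalization with a residual connection. It is not the paper's exact formulation (which is not given on this page); the function names, the interpolation exponent `tau`, and the toy graph are all illustrative assumptions. The key property is that `tau` interpolates between leaving embeddings untouched (`tau=1`) and hard row-wise L2 normalization (`tau=0`), so embedding norms are compressed rather than collapsed, preserving per-node diversity.

```python
import numpy as np

def soft_normalize(H, tau=0.5, eps=1e-12):
    """Softly rescale each row (node embedding) of H toward unit norm.

    tau=1.0 -> identity (no normalization);
    tau=0.0 -> hard row-wise L2 normalization (all rows on the unit sphere);
    0<tau<1 -> partial shrinkage of norms, keeping relative differences.
    """
    norms = np.maximum(np.linalg.norm(H, axis=1, keepdims=True), eps)
    return H * norms ** (tau - 1.0)  # per-row scale: ||h||^(tau-1)

def gnn_layer(A_hat, H, W, tau=0.5):
    """One propagation step: aggregate, transform, softly normalize,
    then add a residual connection back to the input embeddings."""
    return soft_normalize(A_hat @ H @ W, tau=tau) + H

# Toy example: 4 nodes, 3-dimensional embeddings, a normalized adjacency.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 3)) * 0.1
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float) + np.eye(4)
A_hat = A / A.sum(axis=1, keepdims=True)  # simple row normalization

H_next = gnn_layer(A_hat, H, W, tau=0.5)
```

Stacking many such layers, the residual term keeps the input features reachable while the soft normalization bounds how quickly neighboring embeddings can contract toward each other.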

Published

2024-03-24

How to Cite

Li, J., Zhang, Q., Xu, S., Chen, X., Guo, L., & Fu, Y.-G. (2024). Curriculum-Enhanced Residual Soft An-Isotropic Normalization for Over-Smoothness in Deep GNNs. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13528-13536. https://doi.org/10.1609/aaai.v38i12.29256

Section

AAAI Technical Track on Machine Learning III