Regularizing Graph Neural Networks via Consistency-Diversity Graph Augmentations

Authors

  • Deyu Bo, Beijing University of Posts and Telecommunications
  • Binbin Hu, Ant Group
  • Xiao Wang, Beijing University of Posts and Telecommunications
  • Zhiqiang Zhang, Ant Group
  • Chuan Shi, Beijing University of Posts and Telecommunications
  • Jun Zhou, Ant Group

DOI:

https://doi.org/10.1609/aaai.v36i4.20307

Keywords:

Data Mining & Knowledge Management (DMKM)

Abstract

Despite the remarkable performance of graph neural networks (GNNs) in semi-supervised learning, they are criticized for not making full use of unlabeled data and for suffering from over-fitting. Recently, graph data augmentation, used to improve both the accuracy and generalization of GNNs, has received considerable attention. However, one fundamental question remains: how can we evaluate the quality of graph augmentations in principle? In this paper, we propose two metrics, Consistency and Diversity, from the aspects of augmentation correctness and generalization. Moreover, we discover that existing augmentations fall into a dilemma between these two metrics. Can we find a graph augmentation satisfying both consistency and diversity? A well-informed answer can help us understand the mechanism behind graph augmentation and improve the performance of GNNs. To tackle this challenge, we analyze two representative semi-supervised learning algorithms: label propagation (LP) and consistency regularization (CR). We find that LP utilizes the prior knowledge of graphs to improve consistency, while CR adopts variable augmentations to promote diversity. Based on this discovery, we treat neighbors as augmentations to capture the prior knowledge embodying the homophily assumption, which ensures a high consistency of augmentations. To further promote diversity, we randomly replace the immediate neighbors of each node with its remote neighbors. Then, a neighbor-constrained regularization is proposed to enforce the predictions of the augmented neighbors to be consistent with each other. Extensive experiments on five real-world graphs validate the superiority of our method in improving the accuracy and generalization of GNNs.
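The augmentation the abstract describes has two parts: replacing a node's immediate neighbors with remote (multi-hop) neighbors, and a regularizer that pulls the predictions of the augmented neighbors toward agreement. A minimal sketch of these two ideas is shown below; the helper names (`two_hop_neighbors`, `augment_neighbors`, `neighbor_consistency_loss`), the replacement probability `p`, and the use of 2-hop nodes as "remote neighbors" are illustrative assumptions, not the paper's exact formulation.

```python
import random

def two_hop_neighbors(adj, v):
    # Remote (2-hop) neighbors of v: nodes reachable through an
    # immediate neighbor, excluding v itself and its 1-hop neighbors.
    hop1 = set(adj[v])
    hop2 = set()
    for u in hop1:
        hop2.update(adj[u])
    return sorted(hop2 - hop1 - {v})

def augment_neighbors(adj, v, p=0.5, rng=random):
    # With probability p, replace each immediate neighbor of v with a
    # randomly chosen remote neighbor (illustrative replacement rule).
    remote = two_hop_neighbors(adj, v)
    if not remote:
        return list(adj[v])
    return [rng.choice(remote) if rng.random() < p else u for u in adj[v]]

def neighbor_consistency_loss(preds, neighbors):
    # Neighbor-constrained regularizer (sketch): mean squared deviation
    # of each augmented neighbor's prediction from their average,
    # encouraging the predictions to agree with each other.
    k = len(neighbors)
    dim = len(preds[neighbors[0]])
    avg = [sum(preds[u][d] for u in neighbors) / k for d in range(dim)]
    return sum((preds[u][d] - avg[d]) ** 2
               for u in neighbors for d in range(dim)) / k
```

In a training loop, this loss would be added to the usual supervised cross-entropy on labeled nodes, so unlabeled nodes still contribute a gradient through their augmented neighborhoods.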

Published

2022-06-28

How to Cite

Bo, D., Hu, B., Wang, X., Zhang, Z., Shi, C., & Zhou, J. (2022). Regularizing Graph Neural Networks via Consistency-Diversity Graph Augmentations. Proceedings of the AAAI Conference on Artificial Intelligence, 36(4), 3913-3921. https://doi.org/10.1609/aaai.v36i4.20307

Section

AAAI Technical Track on Data Mining and Knowledge Management