Graph Invariant Learning with Subgraph Co-mixup for Out-of-Distribution Generalization

Authors

  • Tianrui Jia, Beijing University of Posts and Telecommunications
  • Haoyang Li, Tsinghua University
  • Cheng Yang, School of Computer Science, Beijing University of Posts and Telecommunications
  • Tao Tao, China Mobile Information Technology Co., Ltd.
  • Chuan Shi, Beijing University of Posts and Telecommunications

DOI:

https://doi.org/10.1609/aaai.v38i8.28700

Keywords:

DMKM: Graph Mining, Social Network Analysis & Community

Abstract

Graph neural networks (GNNs) have been demonstrated to perform well in graph representation learning, but they often lack generalization capability when tackling out-of-distribution (OOD) data. Graph invariant learning methods, backed by the invariance principle across defined multiple environments, have shown effectiveness in dealing with this issue. However, existing methods rely heavily on well-predefined or accurately generated environment partitions, which are hard to obtain in practice, leading to sub-optimal OOD generalization performance. In this paper, we propose a novel graph invariant learning method based on an invariant and variant patterns co-mixup strategy, which is capable of jointly generating mixed multiple environments and capturing invariant patterns from the mixed graph data. Specifically, we first adopt a subgraph extractor to identify invariant subgraphs. Subsequently, we design a novel co-mixup strategy, i.e., jointly conducting environment mixup and invariant mixup. For the environment mixup, we mix the variant environment-related subgraphs so as to generate sufficiently diverse multiple environments, which is important to guarantee the quality of graph invariant learning. For the invariant mixup, we mix the invariant subgraphs, further encouraging the model to capture invariant patterns behind graphs while getting rid of spurious correlations for OOD generalization. We demonstrate that the proposed environment mixup and invariant mixup can mutually promote each other. Extensive experiments on both synthetic and real-world datasets demonstrate that our method significantly outperforms state-of-the-art baselines under various distribution shifts.
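To make the co-mixup idea in the abstract concrete, below is a minimal, hypothetical PyTorch sketch. The soft node-mask extractor, the mean readout, and the representation-level mixing of invariant and variant parts are all illustrative assumptions for exposition, not the authors' released implementation or exact formulation.

```python
# Hypothetical sketch of subgraph co-mixup: split each graph into invariant
# and variant (environment-related) parts via a learned soft node mask, then
# mix the two parts across a pair of graphs with a shared coefficient.
import torch
import torch.nn as nn


class SubgraphExtractor(nn.Module):
    """Scores each node: mask near 1 = invariant, near 0 = variant."""

    def __init__(self, dim):
        super().__init__()
        self.scorer = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, node_feats):
        # node_feats: [num_nodes, dim] -> soft mask in (0, 1) per node
        return torch.sigmoid(self.scorer(node_feats)).squeeze(-1)


def co_mixup(h_a, h_b, mask_a, mask_b, lam):
    """Jointly mix two graphs' node representations.

    h_a, h_b: node representations of two graphs, [n_a, dim] and [n_b, dim].
    mask_a, mask_b: soft invariant masks from the extractor.
    lam: mixing coefficient (e.g., Beta-sampled).
    Returns graph-level mixed invariant and mixed environment vectors.
    """
    # Split into invariant / variant parts, then mean-readout to graph level.
    inv_a = (mask_a.unsqueeze(-1) * h_a).mean(dim=0)
    inv_b = (mask_b.unsqueeze(-1) * h_b).mean(dim=0)
    var_a = ((1 - mask_a).unsqueeze(-1) * h_a).mean(dim=0)
    var_b = ((1 - mask_b).unsqueeze(-1) * h_b).mean(dim=0)

    # Invariant mixup: interpolate invariant subgraph representations.
    inv_mixed = lam * inv_a + (1 - lam) * inv_b
    # Environment mixup: interpolate variant subgraph representations to
    # synthesize a new, more diverse environment.
    env_mixed = lam * var_a + (1 - lam) * var_b
    return inv_mixed, env_mixed


if __name__ == "__main__":
    torch.manual_seed(0)
    dim = 16
    extractor = SubgraphExtractor(dim)
    h_a, h_b = torch.randn(10, dim), torch.randn(14, dim)  # two toy graphs
    lam = torch.distributions.Beta(2.0, 2.0).sample()
    inv_mixed, env_mixed = co_mixup(h_a, h_b, extractor(h_a), extractor(h_b), lam)
    print(inv_mixed.shape, env_mixed.shape)  # torch.Size([16]) torch.Size([16])
```

In a full pipeline, the mixed environment vectors would be used to construct diverse training environments for an invariance penalty, while the mixed invariant vectors would feed the label predictor; the abstract's claim is that these two mixups reinforce each other.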

Published

2024-03-24

How to Cite

Jia, T., Li, H., Yang, C., Tao, T., & Shi, C. (2024). Graph Invariant Learning with Subgraph Co-mixup for Out-of-Distribution Generalization. Proceedings of the AAAI Conference on Artificial Intelligence, 38(8), 8562-8570. https://doi.org/10.1609/aaai.v38i8.28700

Section

AAAI Technical Track on Data Mining & Knowledge Management