CauVQ: Causal Vector Quantization for Graph OOD Generalization

Authors

  • Weihong Zhang, Shanxi University
  • Liang Bai, Shanxi University
  • Hangyuan Du, Shanxi University
  • Xian Yang, University of Manchester

DOI:

https://doi.org/10.1609/aaai.v40i33.40070

Abstract

Graph Neural Networks (GNNs) perform well on in-distribution data but often fail under out-of-distribution (OOD) shifts due to reliance on spurious patterns. To address this, we propose CauVQ, a causal vector quantization framework that improves OOD generalization by identifying and leveraging invariant substructures that are causally predictive. To construct stable and symbolic graph representations, CauVQ decomposes each input into local substructures and maps them to a discrete codebook of prototypical motifs. This enables consistent and interpretable encoding across diverse graph domains. To isolate the causal substructures, we maximize their mutual information with graph labels and refine their representations using a learnable interaction matrix and a causal attention mechanism. Furthermore, we introduce a counterfactual regularization strategy to enforce prediction stability under substructure perturbations, encouraging the model to focus on truly causal patterns rather than superficial shortcuts. Extensive experiments across standard and OOD benchmarks demonstrate that CauVQ consistently outperforms state-of-the-art baselines in robustness and interpretability. Our framework offers a promising step toward reliable, explainable, and distribution-aware graph learning.
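The core quantization step described in the abstract — mapping each local substructure embedding to its nearest prototypical motif in a discrete codebook — can be illustrated with a minimal sketch. This is not the authors' implementation; the function and variable names (`quantize`, `codebook`, `embeddings`) are hypothetical, and the nearest-neighbor lookup under squared Euclidean distance is a standard vector-quantization choice assumed here for illustration.

```python
# Hedged sketch of the vector-quantization step from the abstract:
# each substructure embedding is replaced by the index and vector of its
# nearest code in a learned codebook of prototypical motifs.
# All names are illustrative, not from the paper.

def quantize(embeddings, codebook):
    """Map each embedding to (index, vector) of its nearest codebook
    entry under squared Euclidean distance."""
    indices, quantized = [], []
    for e in embeddings:
        dists = [sum((a - b) ** 2 for a, b in zip(e, c)) for c in codebook]
        k = min(range(len(codebook)), key=dists.__getitem__)
        indices.append(k)          # discrete, symbolic code for this substructure
        quantized.append(codebook[k])  # its prototype representation
    return indices, quantized

# Toy example: two prototype motifs, three substructure embeddings.
codebook = [[0.0, 0.0], [1.0, 1.0]]
embeddings = [[0.1, -0.2], [0.9, 1.1], [0.4, 0.7]]
idx, q = quantize(embeddings, codebook)  # idx == [0, 1, 1]
```

In a trainable version, the codebook entries would be learned parameters and the non-differentiable nearest-neighbor lookup would typically be bridged with a straight-through gradient estimator, as is common in vector-quantized models.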

Published

2026-03-14

How to Cite

Zhang, W., Bai, L., Du, H., & Yang, X. (2026). CauVQ: Causal Vector Quantization for Graph OOD Generalization. Proceedings of the AAAI Conference on Artificial Intelligence, 40(33), 28409–28417. https://doi.org/10.1609/aaai.v40i33.40070

Section

AAAI Technical Track on Machine Learning X