Stage-Aware Graph Contrastive Learning with Node-oriented Mixture of Experts
DOI: https://doi.org/10.1609/aaai.v40i19.38695
Abstract
Text-attributed graphs (TAGs), which associate rich textual descriptions with each node, are widely employed to represent complex relationships among real-world textual entities. Current representation learning for TAGs leverages large language models (LLMs) to transform node-matched textual descriptions into node features or labels, followed by message passing in graph neural networks (GNNs), which further improves the expressiveness of graph representation learning. Nevertheless, a simple experiment we conducted demonstrates that not all LLMs are readily compatible with GNNs. A salient finding is that architectural heterogeneity among LLMs manifests as substantial performance gaps across diverse TAG representation learning tasks. Moreover, the node semantics encoded by LLMs are often misaligned with message passing in GNNs, causing performance collapse. Motivated by these observations, we propose a novel self-supervised graph learning framework called Stage-Aware Graph Contrastive Learning (SAGCL). In particular, we propose a node-oriented mixture of experts (NodeMoE) to assign suitable candidate experts to each node; it flexibly balances the strengths of different language experts through low-rank decomposition and reparameterization strategies. Subsequently, to align the inductive biases of graph structures with the semantic perception capabilities of LLMs, message passing in GNNs is decoupled into a feature transformation stage and a feature propagation stage. Given these two stage views, stage-aware graph contrastive learning matches the node semantics encoded by the LLM with the locally aware topological patterns within the GNN via self-supervised contrastive learning. Experiments on eight datasets and three downstream tasks demonstrate the effectiveness of SAGCL.
Published
2026-03-14
How to Cite
Zhu, X., Yan, Y., Long, S., Li, C., Chen, G., & Su, L. (2026). Stage-Aware Graph Contrastive Learning with Node-oriented Mixture of Experts. Proceedings of the AAAI Conference on Artificial Intelligence, 40(19), 16548–16556. https://doi.org/10.1609/aaai.v40i19.38695
Issue
Section
AAAI Technical Track on Data Mining & Knowledge Management III