Substructure Aware Graph Neural Networks
DOI:
https://doi.org/10.1609/aaai.v37i9.26318
Keywords:
ML: Graph-based Machine Learning, DMKM: Graph Mining, Social Network Analysis & Community Mining, ML: Representation Learning
Abstract
Despite the great achievements of Graph Neural Networks (GNNs) in graph learning, conventional GNNs cannot exceed the expressive power of the first-order Weisfeiler-Leman graph isomorphism test (1-WL), because their propagation paradigm is consistent with the 1-WL. Based on the fact that the original graph is easier to distinguish through its subgraphs, we propose a novel neural network framework called Substructure Aware Graph Neural Networks (SAGNN) to address these issues. We first propose the Cut subgraph, which is obtained from the original graph by continuously and selectively removing edges. We then extend the random walk encoding paradigm to the return probability of the rooted node on the subgraph, capturing structural information that is used as a node feature to improve the expressiveness of GNNs. We theoretically prove that our framework is more powerful than 1-WL and is superior in structure perception. Extensive experiments demonstrate the effectiveness of our framework, which achieves state-of-the-art performance on a variety of well-established graph tasks; GNNs equipped with our framework perform flawlessly even on graphs where 3-WL fails. Specifically, our framework achieves a maximum performance improvement of 83% over the base models and 32% over the previous state-of-the-art methods.
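The abstract describes encoding each node by the return probability of a random walk rooted at that node, computed on a subgraph, and using the resulting vector as a structural node feature. Below is a minimal illustrative sketch of that return-probability encoding, not the authors' released implementation; the graph construction and the walk length `num_steps` are assumptions, and the Cut-subgraph extraction step is omitted.

import numpy as np
import networkx as nx

def return_probability_encoding(G: nx.Graph, num_steps: int = 8) -> np.ndarray:
    """Return an (n, num_steps) matrix whose (v, k) entry is the probability
    that a (k+1)-step random walk started at node v returns to v."""
    nodes = list(G.nodes())
    A = nx.to_numpy_array(G, nodelist=nodes)   # adjacency matrix
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                        # guard against isolated nodes
    P = A / deg                                # row-stochastic transition matrix
    feats = np.zeros((len(nodes), num_steps))
    Pk = np.eye(len(nodes))
    for k in range(num_steps):
        Pk = Pk @ P                            # (k+1)-step transition probabilities
        feats[:, k] = np.diag(Pk)              # diagonal = return probabilities
    return feats

# Usage: structural features for a 6-node cycle graph (a stand-in for a Cut subgraph).
G = nx.cycle_graph(6)
print(return_probability_encoding(G, num_steps=4))

In the paper's framework these per-node return-probability vectors would be concatenated with (or added to) the input node features of a base GNN; the sketch above only shows how the encoding itself can be computed.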
Published
2023-06-26
How to Cite
Zeng, D., Liu, W., Chen, W., Zhou, L., Zhang, M., & Qu, H. (2023). Substructure Aware Graph Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9), 11129-11137. https://doi.org/10.1609/aaai.v37i9.26318
Issue
Vol. 37 No. 9 (2023)
Section
AAAI Technical Track on Machine Learning IV