Large Language Model Meets Graph Neural Network in Knowledge Distillation

Authors

  • Shengxiang Hu, Shanghai University
  • Guobing Zou, Shanghai University
  • Song Yang, Shanghai University
  • Shiyi Lin, Shanghai University
  • Yanglan Gan, Donghua University, Shanghai
  • Bofeng Zhang, Shanghai Polytechnic University
  • Yixin Chen, Washington University in St. Louis

DOI:

https://doi.org/10.1609/aaai.v39i16.33901

Abstract

While Large Language Models (LLMs) show promise for learning on Text-Attributed Graphs (TAGs), their deployment is hindered by heavy computational demands. Graph Neural Networks (GNNs) are efficient but struggle with the complex semantics of TAGs. We propose LinguGKD, a novel LLM-to-GNN knowledge distillation framework that transfers both local semantic details and global structural information from LLMs to GNNs. First, it introduces TAG-oriented instruction tuning, enhancing LLMs with graph-specific knowledge through carefully designed prompts. Next, it develops a layer-adaptive multi-scale contrastive distillation strategy that aligns LLM and GNN features at multiple granularities, from the node level to the graph level. Finally, the distilled GNNs combine the semantic richness of LLMs with the computational efficiency of traditional GNNs. Experiments demonstrate that LinguGKD outperforms existing graph distillation frameworks; the distilled simple GNNs achieve performance comparable or superior to that of more complex GNNs and the teacher LLMs while maintaining computational efficiency. This work bridges the gap between LLMs and GNNs, facilitating advanced graph learning in resource-constrained environments and providing a framework for leveraging ongoing LLM advancements to improve GNNs.
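The contrastive alignment idea in the abstract can be illustrated with a minimal sketch. The paper's exact loss and layer-adaptive weighting are not given here, so the function below assumes a standard InfoNCE-style formulation: for a batch of nodes, each GNN (student) embedding is pulled toward its own LLM (teacher) embedding and pushed away from the other nodes' teacher embeddings. The function name and shapes are illustrative, not from the paper.

```python
import numpy as np

def info_nce_alignment(gnn_feats, llm_feats, temperature=0.1):
    """Contrastive alignment between student (GNN) and teacher (LLM)
    node features: row i of gnn_feats is the positive pair of row i
    of llm_feats; all other rows serve as in-batch negatives."""
    # L2-normalize both views so the dot product is cosine similarity
    g = gnn_feats / np.linalg.norm(gnn_feats, axis=1, keepdims=True)
    t = llm_feats / np.linalg.norm(llm_feats, axis=1, keepdims=True)
    logits = g @ t.T / temperature          # (N, N) similarity matrix
    # numerically stable log-softmax over each row
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the diagonal (matched pairs) as the targets
    return -np.mean(np.diag(log_prob))
```

A well-aligned student yields a lower loss than a misaligned one; in the full framework such a loss would be computed at several granularities (node to graph level) and combined across layers.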

Published

2025-04-11

How to Cite

Hu, S., Zou, G., Yang, S., Lin, S., Gan, Y., Zhang, B., & Chen, Y. (2025). Large Language Model Meets Graph Neural Network in Knowledge Distillation. Proceedings of the AAAI Conference on Artificial Intelligence, 39(16), 17295–17304. https://doi.org/10.1609/aaai.v39i16.33901

Section

AAAI Technical Track on Machine Learning II