Efficient Traffic Prediction Through Spatio-Temporal Distillation

Authors

  • Qianru Zhang, The University of Hong Kong
  • Xinyi Gao, The University of Queensland
  • Haixin Wang, University of California, Los Angeles
  • Siu Ming Yiu, The University of Hong Kong
  • Hongzhi Yin, The University of Queensland

DOI:

https://doi.org/10.1609/aaai.v39i1.32096

Abstract

Graph neural networks (GNNs) have gained considerable attention in recent years for traffic flow prediction due to their ability to learn spatio-temporal pattern representations through a graph-based message-passing framework. Although GNNs have shown great promise in handling traffic datasets, their deployment in real-life applications has been hindered by scalability constraints arising from high-order message passing. Additionally, the over-smoothing problem of GNNs may lead to indistinguishable region representations as the number of layers increases, resulting in performance degradation. To address these challenges, we propose a new knowledge distillation paradigm termed LightST that transfers spatial and temporal knowledge from a high-capacity teacher to a lightweight student. Specifically, we introduce a spatio-temporal knowledge distillation framework that helps student MLPs capture graph-structured global spatio-temporal patterns while alleviating the over-smoothing effect with adaptive knowledge distillation. Extensive experiments verify that LightST significantly speeds up traffic flow predictions by 5X to 40X compared to state-of-the-art spatio-temporal GNNs, all while maintaining superior accuracy.
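The teacher-to-student transfer described in the abstract is commonly implemented as a soft-target matching objective. As an illustration only — this is a generic Hinton-style distillation loss sketched under our own assumptions, not LightST's actual spatio-temporal objective, which the paper details — the student is trained to match the teacher's temperature-softened output distribution:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution,
    # exposing more of the teacher's "dark knowledge" to the student.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 as in standard knowledge distillation, so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In this toy form, the high-capacity teacher (in LightST, a spatio-temporal GNN) produces `teacher_logits`, and the lightweight student MLP minimizes this loss alongside its task loss; at inference only the MLP runs, which is where the reported 5x–40x speedup comes from.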

Published

2025-04-11

How to Cite

Zhang, Q., Gao, X., Wang, H., Yiu, S. M., & Yin, H. (2025). Efficient Traffic Prediction Through Spatio-Temporal Distillation. Proceedings of the AAAI Conference on Artificial Intelligence, 39(1), 1093–1101. https://doi.org/10.1609/aaai.v39i1.32096

Section

AAAI Technical Track on Application Domains