GraFT: Infusing Pre-trained Transformers with Relational Structure for Time Series Forecasting

Authors

  • Yuqi Yuan University of Science and Technology Beijing
  • Xiong Luo University of Science and Technology Beijing
  • Qiaojuan Peng University of Science and Technology Beijing
  • Wenbing Zhao Cleveland State University

DOI:

https://doi.org/10.1609/aaai.v40i33.40029

Abstract

Large Language Models (LLMs) have recently emerged as a leading approach for multivariate time series forecasting. However, their effectiveness is hampered by a fundamental architectural mismatch: the permutation-invariant self-attention of Transformers lacks inductive biases for the strict temporal order and complex cross-variable dependencies inherent in time series. Existing methods often sidestep this issue with input-level alignment techniques rather than endowing the model itself with structural awareness. To address this gap, we introduce GraFT (Graph-infused Forecasting Transformer), a framework that systematically embeds relational priors into a pre-trained backbone by constructing a heterogeneous patch relation graph, which represents both universal temporal principles with static edges and instance-specific patterns with dynamic adaptive edges. To process this multi-relational structure, a relational graph convolutional network generates structure-aware representations, which are infused into the patch embeddings to provide explicit structural guidance to the Transformer's attention mechanism. Extensive experiments show that GraFT achieves state-of-the-art performance on long-term forecasting and zero-shot learning, outperforming leading LLM-based methods on eight standard benchmarks with an average Mean Squared Error (MSE) reduction of 14.4%.
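The core mechanism the abstract describes — a multi-relational patch graph (static temporal edges plus dynamic instance-specific edges) processed by a relational GCN whose output is infused into the patch embeddings — can be sketched in miniature. This is an illustrative NumPy sketch, not the authors' implementation: the toy sizes, the random "dynamic" adjacency standing in for learned edges, and the additive infusion are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

num_patches, d = 6, 8                     # toy: 6 patch tokens, width 8
patch_emb = rng.normal(size=(num_patches, d))

# Two relation types over patches: a static chain encoding temporal
# order, and a dynamic adjacency (random here; learned in the paper).
A_static = np.eye(num_patches, k=1) + np.eye(num_patches, k=-1)
A_dynamic = (rng.random((num_patches, num_patches)) > 0.7).astype(float)

def row_normalize(A):
    """Row-normalize so each node averages over its neighbors."""
    deg = A.sum(axis=1, keepdims=True)
    return A / np.maximum(deg, 1.0)

# R-GCN-style layer: one weight matrix per relation type plus a
# self-loop transform; weights are random stand-ins.
W_static = rng.normal(size=(d, d)) * 0.1
W_dynamic = rng.normal(size=(d, d)) * 0.1
W_self = rng.normal(size=(d, d)) * 0.1

def rgcn_layer(X):
    h = X @ W_self
    h += row_normalize(A_static) @ X @ W_static
    h += row_normalize(A_dynamic) @ X @ W_dynamic
    return np.maximum(h, 0.0)             # ReLU

structure_repr = rgcn_layer(patch_emb)

# "Infusion": add the structure-aware representation back onto the
# patch embeddings before they enter the pre-trained Transformer.
infused = patch_emb + structure_repr
print(infused.shape)                      # (6, 8)
```

In the actual model the infused embeddings would then be fed to the frozen or fine-tuned Transformer backbone, so its attention operates on tokens that already carry relational structure.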

Published

2026-03-14

How to Cite

Yuan, Y., Luo, X., Peng, Q., & Zhao, W. (2026). GraFT: Infusing Pre-trained Transformers with Relational Structure for Time Series Forecasting. Proceedings of the AAAI Conference on Artificial Intelligence, 40(33), 28041–28049. https://doi.org/10.1609/aaai.v40i33.40029

Section

AAAI Technical Track on Machine Learning X