Global-Lens Transformers: Adaptive Token Mixing for Dynamic Link Prediction

Authors

  • Tao Zou, Beihang University
  • Chengfeng Wu, Tsinghua Shenzhen International Graduate School
  • Tianxi Liao, Beihang University
  • Junchen Ye, Beihang University
  • Bowen Du, Beihang University; Zhongguancun Laboratory

DOI:

https://doi.org/10.1609/aaai.v40i19.38698

Abstract

Dynamic graph learning plays a pivotal role in modeling evolving relationships over time, especially for temporal link prediction tasks in domains such as traffic systems, social networks, and recommendation platforms. While Transformer-based models have demonstrated strong performance by capturing long-range temporal dependencies, their reliance on self-attention results in quadratic complexity with respect to sequence length, limiting scalability on high-frequency or large-scale graphs. In this work, we revisit the necessity of self-attention in dynamic graph modeling. Inspired by recent findings that attribute the success of Transformers more to their architectural design than to attention itself, we propose GLFormer, a novel attention-free Transformer-style framework for dynamic graphs. GLFormer introduces an adaptive token mixer that performs context-aware local aggregation based on interaction order and time intervals. To capture long-term dependencies, we further design a hierarchical aggregation module that expands the temporal receptive field by stacking local token mixers across layers. Experiments on six widely used dynamic graph benchmarks show that GLFormer achieves competitive or superior performance with significantly improved efficiency, demonstrating that attention-free architectures can match or surpass Transformer baselines in dynamic graph settings.
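To make the two ideas in the abstract concrete, here is a minimal NumPy sketch of an attention-free, time-aware local token mixer and of how stacking such mixers widens the temporal receptive field. This is an illustrative assumption, not the paper's actual GLFormer implementation: the function names (`local_token_mixer`, `hierarchical_mix`), the exponential time-decay weighting, and the window/decay parameters are all hypothetical stand-ins for the adaptive mixer described above.

```python
import numpy as np

def local_token_mixer(x, t, window=3, decay=1.0):
    """Attention-free local aggregation (hypothetical sketch).

    Each token mixes with its most recent `window` predecessors
    (itself included), down-weighting older neighbours by the
    elapsed time between interactions -- no pairwise attention.
    x: (n, d) token features, t: (n,) interaction timestamps.
    """
    n, _ = x.shape
    out = np.zeros_like(x)
    for i in range(n):
        lo = max(0, i - window + 1)
        dt = t[i] - t[lo:i + 1]       # time gaps to each neighbour
        w = np.exp(-decay * dt)       # recency-based weights
        w = w / w.sum()               # normalise to a convex mix
        out[i] = w @ x[lo:i + 1]      # weighted local aggregation
    return out

def hierarchical_mix(x, t, layers=2, window=3):
    """Stack local mixers: after L layers each token has indirectly
    aggregated roughly L*(window-1)+1 predecessors, expanding the
    temporal receptive field without any quadratic attention."""
    for _ in range(layers):
        x = local_token_mixer(x, t, window=window)
    return x
```

With one-hot inputs this makes the receptive-field growth visible: with `window=2`, a single mixer layer lets token 2 see only tokens 1 and 2, while two stacked layers let it also pick up a contribution from token 0.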

Published

2026-03-14

How to Cite

Zou, T., Wu, C., Liao, T., Ye, J., & Du, B. (2026). Global-Lens Transformers: Adaptive Token Mixing for Dynamic Link Prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 40(19), 16575–16583. https://doi.org/10.1609/aaai.v40i19.38698

Section

AAAI Technical Track on Data Mining & Knowledge Management III