GT-SNT: A Linear-Time Transformer for Large-Scale Graphs via Spiking Node Tokenization

Authors

  • Huizhe Zhang Sun Yat-sen University
  • Jintang Li Xiamen University
  • Yuchang Zhu Sun Yat-sen University
  • Huazhen Zhong Sun Yat-sen University
  • Liang Chen Sun Yat-sen University

DOI:

https://doi.org/10.1609/aaai.v40i19.38667

Abstract

Graph Transformers (GTs), which combine message passing with self-attention, have achieved promising empirical results on graph prediction tasks. However, the design of scalable, topology-aware node tokenization has lagged behind other modalities, a gap that becomes critical as the quadratic complexity of full attention renders GTs impractical on large-scale graphs. Recently, Spiking Neural Networks (SNNs), as brain-inspired models, have offered an energy-efficient scheme that converts input intensity into discrete spike-based representations through event-driven spiking neurons. Inspired by these characteristics, we propose a linear-time Graph Transformer with Spiking Node Tokenization (GT-SNT) for node classification. By integrating multi-step feature propagation with SNNs, spiking node tokenization generates compact, locality-aware spike-count embeddings as node tokens, avoiding predefined codebooks and their utilization issues. Codebook-guided self-attention then leverages these tokens to perform node-to-token attention, aggregating global context in linear time. In experiments, we compare GT-SNT with state-of-the-art baselines on node classification datasets ranging from small to large. Results show that GT-SNT achieves comparable performance on most datasets and up to 130× faster inference than other GTs.
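The tokenization idea in the abstract — propagate features over the graph for several steps and let event-driven spiking neurons turn the propagated intensities into discrete spike counts — can be illustrated with a minimal NumPy sketch. The function name, the leaky integrate-and-fire dynamics, and all hyperparameters below are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def spiking_node_tokens(adj, feats, steps=4, threshold=1.0, decay=0.5):
    """Sketch: integer spike-count embeddings, one row per node.

    Assumed mechanics (not from the paper's equations): mean-aggregation
    propagation feeds a leaky integrate-and-fire neuron per feature; the
    number of spikes fired over `steps` becomes the discrete node token.
    """
    # Row-normalize the adjacency for mean-aggregation message passing.
    deg = adj.sum(axis=1, keepdims=True)
    a_hat = adj / np.maximum(deg, 1.0)

    membrane = np.zeros_like(feats, dtype=float)  # neuron state
    counts = np.zeros_like(feats, dtype=float)    # accumulated spikes
    h = feats.astype(float)
    for _ in range(steps):
        h = a_hat @ h                          # one propagation step
        membrane = decay * membrane + h        # leaky integration
        spikes = (membrane >= threshold).astype(float)
        counts += spikes                       # event-driven spike count
        membrane = membrane * (1.0 - spikes)   # hard reset after firing
    return counts.astype(int)                  # compact, locality-aware tokens

# Toy graph: a 4-node path with one-hot features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.eye(4)
tokens = spiking_node_tokens(adj, feats)
print(tokens.shape)  # (4, 4); entries lie in [0, steps]
```

Because each node ends up with a small integer vector rather than a continuous embedding, attention can be computed between nodes and the bounded set of token values instead of between all node pairs, which is the route to the linear-time attention claimed in the abstract.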

Published

2026-03-14

How to Cite

Zhang, H., Li, J., Zhu, Y., Zhong, H., & Chen, L. (2026). GT-SNT: A Linear-Time Transformer for Large-Scale Graphs via Spiking Node Tokenization. Proceedings of the AAAI Conference on Artificial Intelligence, 40(19), 16298-16306. https://doi.org/10.1609/aaai.v40i19.38667

Section

AAAI Technical Track on Data Mining & Knowledge Management III