Scaling Up Dynamic Graph Representation Learning via Spiking Neural Networks


  • Jintang Li Sun Yat-sen University
  • Zhouxin Yu Sun Yat-sen University
  • Zulun Zhu Rochester Institute of Technology
  • Liang Chen Sun Yat-sen University
  • Qi Yu Rochester Institute of Technology
  • Zibin Zheng Sun Yat-sen University
  • Sheng Tian Ant Group
  • Ruofan Wu Ant Group
  • Changhua Meng Ant Group



ML: Graph-based Machine Learning, DMKM: Graph Mining, Social Network Analysis & Community Mining, DMKM: Mining of Spatial, Temporal or Spatio-Temporal Data


Recent years have seen a surge of research on dynamic graph representation learning, which aims to model temporal graphs that evolve constantly over time. However, current work typically models graph dynamics with recurrent neural networks (RNNs), which suffer from severe computation and memory overheads on large temporal graphs. To date, the scalability of dynamic graph representation learning on large temporal graphs remains one of the major challenges. In this paper, we present a scalable framework, namely SpikeNet, to efficiently capture the temporal and structural patterns of temporal graphs. We explore a new direction: capturing the evolving dynamics of temporal graphs with spiking neural networks (SNNs) instead of RNNs. As a low-power alternative to RNNs, SNNs explicitly model graph dynamics as spike trains of neuron populations and enable efficient spike-based propagation. Experiments on three large real-world temporal graph datasets demonstrate that SpikeNet outperforms strong baselines on the temporal node classification task with lower computational costs. In particular, SpikeNet generalizes to a large temporal graph (2.7M nodes and 13.9M edges) with significantly fewer parameters and less computation overhead.
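To make the "spike trains of neuron populations" idea concrete, the sketch below simulates a discrete-time leaky integrate-and-fire (LIF) neuron population in NumPy. This is a generic illustration of spike-based dynamics, not the authors' SpikeNet implementation; the function name, time constant, and threshold values are assumptions for illustration only.

```python
# Minimal LIF sketch: not SpikeNet itself, just the spiking mechanism
# the abstract refers to. All constants here are illustrative choices.
import numpy as np

def lif_step(v, x, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """One discrete-time LIF update: leak-integrate the input x into
    the membrane potential v, emit a binary spike wherever v crosses
    the threshold, and hard-reset the neurons that fired."""
    v = v + (x - v) / tau                          # leaky integration
    spikes = (v >= v_threshold).astype(np.float32)  # binary spikes
    v = np.where(spikes > 0, v_reset, v)            # reset after firing
    return v, spikes

# Drive a small population with a constant input over a few steps;
# the resulting 0/1 spike train plays the role of a hidden state.
v = np.zeros(3, dtype=np.float32)
train = []
for _ in range(4):
    v, s = lif_step(v, np.full(3, 1.5, dtype=np.float32))
    train.append(s)
spike_train = np.stack(train)  # shape (T=4, N=3), binary entries
# Under this constant drive each neuron fires every other step:
# spike_train[:, 0] is [0., 1., 0., 1.]
```

Because the hidden state is a binary spike train rather than a dense float vector, downstream propagation can be implemented with cheap sparse/event-driven operations, which is the source of the efficiency argument in the abstract.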




How to Cite

Li, J., Yu, Z., Zhu, Z., Chen, L., Yu, Q., Zheng, Z., Tian, S., Wu, R., & Meng, C. (2023). Scaling Up Dynamic Graph Representation Learning via Spiking Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 8588-8596.



AAAI Technical Track on Machine Learning II