SpikingSSMs: Learning Long Sequences with Sparse and Parallel Spiking State Space Models

Authors

  • Shuaijie Shen (Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen; ACSLab, Huawei Technologies Co., Ltd., Shenzhen)
  • Chao Wang (Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen; ACSLab, Huawei Technologies Co., Ltd., Shenzhen)
  • Renzhuo Huang (Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen; ACSLab, Huawei Technologies Co., Ltd., Shenzhen)
  • Yan Zhong (ACSLab, Huawei Technologies Co., Ltd., Shenzhen; School of Mathematical Sciences, Peking University, Beijing)
  • Qinghai Guo (ACSLab, Huawei Technologies Co., Ltd., Shenzhen)
  • Zhichao Lu (Department of Computer Science, City University of Hong Kong, Hong Kong)
  • Jianguo Zhang (Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen; Pengcheng Laboratory, Shenzhen)
  • Luziwei Leng (ACSLab, Huawei Technologies Co., Ltd., Shenzhen)

DOI:

https://doi.org/10.1609/aaai.v39i19.34245

Abstract

Known for their low energy consumption, spiking neural networks (SNNs) have attracted considerable attention over the past decades. While SNNs are increasingly competitive with artificial neural networks (ANNs) on vision tasks, they are rarely used for long sequence tasks, despite their intrinsic temporal dynamics. In this work, we develop spiking state space models (SpikingSSMs) for long sequence learning by leveraging the sequence learning abilities of state space models (SSMs). Inspired by dendritic neuron structure, we hierarchically integrate neuronal dynamics with the original SSM block while realizing sparse synaptic computation. Furthermore, to resolve the conflict between event-driven neuronal dynamics and parallel computing, we propose a lightweight surrogate dynamic network which accurately predicts the after-reset membrane potential and is compatible with learnable thresholds, enabling orders-of-magnitude acceleration in training speed compared with conventional iterative methods. On the Long Range Arena benchmark, SpikingSSM achieves performance competitive with state-of-the-art SSMs while realizing on average 90% network sparsity. On language modeling, our network significantly surpasses existing spiking large language models (spikingLLMs) on the WikiText-103 dataset with only a third of the model size, demonstrating its potential as a backbone architecture for low-computation-cost LLMs.
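To illustrate the conflict the abstract refers to between event-driven neuronal dynamics and parallel computing: in a standard leaky integrate-and-fire (LIF) neuron, the membrane potential at each step depends on whether a spike and reset occurred at the previous step, forcing a sequential loop over time. The following is a minimal sketch of that iterative process only; the neuron model, constants, and function names are illustrative assumptions, not the paper's implementation (which replaces this loop with a learned surrogate to enable parallel training).

```python
import numpy as np

def lif_iterative(currents, tau=2.0, v_th=1.0):
    """Sequentially integrate input currents with a LIF neuron:
    leak, add input, spike when the membrane crosses threshold,
    then hard-reset. The data dependence of each step on the
    previous reset is what resists parallelization over time."""
    v = 0.0
    spikes = []
    for c in currents:
        v = v / tau + c               # leaky integration of the input
        s = 1.0 if v >= v_th else 0.0  # threshold crossing emits a spike
        spikes.append(s)
        v = v * (1.0 - s)             # hard reset after a spike
    return np.array(spikes)

# Constant drive of 0.6 per step: the membrane charges up,
# fires once it crosses v_th, then restarts from zero.
out = lif_iterative(np.array([0.6, 0.6, 0.6, 0.6]))
```

Because the spike output is binary and sparse, downstream synaptic computation reduces to accumulations gated by these events, which is the source of the sparsity the abstract reports.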

Published

2025-04-11

How to Cite

Shen, S., Wang, C., Huang, R., Zhong, Y., Guo, Q., Lu, Z., … Leng, L. (2025). SpikingSSMs: Learning Long Sequences with Sparse and Parallel Spiking State Space Models. Proceedings of the AAAI Conference on Artificial Intelligence, 39(19), 20380–20388. https://doi.org/10.1609/aaai.v39i19.34245

Section

AAAI Technical Track on Machine Learning V