Memory-Efficient Reversible Spiking Neural Networks

Authors

  • Hong Zhang, State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou, China
  • Yu Zhang, State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou, China; Key Laboratory of Collaborative Sensing and Autonomous Unmanned Systems of Zhejiang Province, Hangzhou, China

DOI:

https://doi.org/10.1609/aaai.v38i15.29616

Keywords:

ML: Bio-inspired Learning, CV: Learning & Optimization for CV, CV: Object Detection & Categorization

Abstract

Spiking neural networks (SNNs) are potential competitors to artificial neural networks (ANNs) due to their high energy efficiency on neuromorphic hardware. However, SNNs are unfolded over simulation time steps during training, so they require far more memory than ANNs, which impedes the training of deeper SNN models. In this paper, we propose the reversible spiking neural network to reduce the memory cost of intermediate activations and membrane potentials during training. First, we extend the reversible architecture along the temporal dimension and propose the reversible spiking block, which can reconstruct the computational graph and recompute all intermediate variables of the forward pass through a reverse process. On this basis, we adapt state-of-the-art SNN models into reversible variants, namely the reversible spiking ResNet (RevSResNet) and the reversible spiking transformer (RevSFormer). Through experiments on static and neuromorphic datasets, we demonstrate that the memory cost per image of our reversible SNNs does not increase with network depth. On the CIFAR10 and CIFAR100 datasets, our RevSResNet37 and RevSFormer-4-384 achieve comparable accuracies while consuming 3.79x and 3.00x less GPU memory per image than their counterparts of roughly identical model complexity and parameter count. We believe that this work can lift the memory constraints in SNN training and pave the way for training extremely large and deep SNNs.
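To illustrate the core idea behind the reversible spiking block, the sketch below shows a generic additive-coupling reversible block in PyTorch, assuming the standard RevNet-style coupling y1 = x1 + F(x2), y2 = x2 + G(y1). The sub-modules F and G here (a convolution followed by a toy surrogate-gradient spike) are hypothetical stand-ins; the paper's actual block design and its extension along the temporal dimension (including membrane-potential recomputation across time steps) are not reconstructed from the abstract alone. Because the mapping is exactly invertible, inputs can be recomputed in the backward pass instead of stored, which is where the memory savings come from.

```python
# Minimal sketch of an additive-coupling reversible block (RevNet-style),
# assumed here as an illustration of the mechanism described in the abstract.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient (toy example)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return (x > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Pass gradients only in a window around the firing threshold.
        return grad_out * (x.abs() < 0.5).float()


class ReversibleBlock(nn.Module):
    """Additive coupling: y1 = x1 + F(x2), y2 = x2 + G(y1).

    The forward map is exactly invertible, so intermediate activations
    need not be cached for backpropagation; they can be reconstructed
    from the block outputs by the reverse process below.
    """

    def __init__(self, channels):
        super().__init__()
        self.f = nn.Conv2d(channels, channels, 3, padding=1)
        self.g = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x1, x2):
        y1 = x1 + SurrogateSpike.apply(self.f(x2))
        y2 = x2 + SurrogateSpike.apply(self.g(y1))
        return y1, y2

    def inverse(self, y1, y2):
        # Reverse process: recover the inputs from the outputs.
        x2 = y2 - SurrogateSpike.apply(self.g(y1))
        x1 = y1 - SurrogateSpike.apply(self.f(x2))
        return x1, x2


if __name__ == "__main__":
    block = ReversibleBlock(channels=8)
    x1, x2 = torch.randn(2, 8, 16, 16), torch.randn(2, 8, 16, 16)
    with torch.no_grad():
        y1, y2 = block(x1, x2)
        r1, r2 = block.inverse(y1, y2)
    # Inputs are reconstructed exactly, confirming invertibility.
    print(torch.allclose(x1, r1), torch.allclose(x2, r2))
```

In a memory-efficient training loop, only the outputs of the final block are kept; each block's inputs are recomputed via `inverse` during the backward pass, so per-image activation memory stays constant as depth grows, consistent with the abstract's claim.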

Published

2024-03-24

How to Cite

Zhang, H., & Zhang, Y. (2024). Memory-Efficient Reversible Spiking Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(15), 16759-16767. https://doi.org/10.1609/aaai.v38i15.29616

Section

AAAI Technical Track on Machine Learning VI