Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object Recognition with Spiking Neural Networks

Authors

  • Yongqi Ding University of Electronic Science and Technology of China
  • Lin Zuo University of Electronic Science and Technology of China
  • Mengmeng Jing University of Electronic Science and Technology of China
  • Pei He University of Electronic Science and Technology of China
  • Yongjun Xiao University of Electronic Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v38i10.29066

Keywords:

ML: Bio-inspired Learning, CV: Object Detection & Categorization, ML: Classification and Regression, ML: Deep Learning Algorithms, ML: Deep Neural Architectures and Foundation Models

Abstract

Neuromorphic object recognition with spiking neural networks (SNNs) is the cornerstone of low-power neuromorphic computing. However, existing SNNs suffer from significant latency, requiring 10 to 40 or more timesteps to recognize neuromorphic objects, and their performance degrades drastically at low latency. In this work, we propose the Shrinking SNN (SSNN), which achieves low-latency neuromorphic object recognition without sacrificing performance. Concretely, we alleviate temporal redundancy in SNNs by dividing them into multiple stages with progressively shrinking timesteps, which significantly reduces inference latency. During timestep shrinkage, a temporal transformer smoothly transforms the temporal scale and preserves information maximally. Moreover, we add multiple early classifiers to the SNN during training to mitigate the mismatch between the surrogate gradient and the true gradient, as well as gradient vanishing/exploding, thereby eliminating performance degradation at low latency. Extensive experiments on the neuromorphic datasets CIFAR10-DVS, N-Caltech101, and DVS-Gesture show that SSNN improves baseline accuracy by 6.55% to 21.41%. With only 5 average timesteps and no data augmentation, SSNN achieves 73.63% accuracy on CIFAR10-DVS. This work presents an SNN with heterogeneous temporal scales and provides valuable insights into the development of high-performance, low-latency SNNs.
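The staged timestep shrinkage described in the abstract can be sketched minimally as follows. This is an illustration only: the grouping-and-averaging used here is a crude stand-in for the paper's learned temporal transformer, and all function names, tensor shapes, and the 8 → 4 → 2 schedule are assumptions for the example, not details from the paper.

```python
import numpy as np

def shrink_timesteps(spikes, out_steps):
    """Shrink a [T, N, C] spike train to [out_steps, N, C] by averaging
    groups of adjacent timesteps.

    NOTE: simple averaging is only a placeholder for the paper's
    temporal transformer, which performs a learned, smooth transform
    of the temporal scale.
    """
    T = spikes.shape[0]
    assert T % out_steps == 0, "T must be divisible by out_steps"
    group = T // out_steps
    # Regroup the time axis into [out_steps, group, ...] and merge each group.
    return spikes.reshape(out_steps, group, *spikes.shape[1:]).mean(axis=1)

# Example: three stages with progressively shrinking timesteps 8 -> 4 -> 2.
# Each stage of the network would process its input at a coarser temporal
# scale than the previous one, reducing overall inference latency.
x = np.random.rand(8, 1, 4)      # [T=8, batch=1, features=4] stage-1 activity
s2_in = shrink_timesteps(x, 4)   # stage-2 input: [4, 1, 4]
s3_in = shrink_timesteps(s2_in, 2)  # stage-3 input: [2, 1, 4]
```

During training, the paper additionally attaches early classifiers to intermediate stages; in a sketch like this, each of `x`, `s2_in`, and `s3_in` would feed its own auxiliary classification head.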

Published

2024-03-24

How to Cite

Ding, Y., Zuo, L., Jing, M., He, P., & Xiao, Y. (2024). Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object Recognition with Spiking Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(10), 11811-11819. https://doi.org/10.1609/aaai.v38i10.29066

Section

AAAI Technical Track on Machine Learning I