Leveraging Asynchronous Spiking Neural Networks for Ultra Efficient Event-Based Visual Processing

Authors

  • DingYi Zeng, University of Electronic Science and Technology of China
  • Yuchen Wang, University of Electronic Science and Technology of China
  • Honglin Cao, University of Electronic Science and Technology of China
  • Wanlong Liu, University of Electronic Science and Technology of China
  • Yichen Xiao, University of Electronic Science and Technology of China
  • Chengzhuo Lu, University of Electronic Science and Technology of China
  • Wenyu Chen, University of Electronic Science and Technology of China
  • Malu Zhang, University of Electronic Science and Technology of China
  • Guoqing Wang, University of Electronic Science and Technology of China
  • Yang Yang, University of Electronic Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v39i2.32154

Abstract

Event cameras encode visual information as asynchronous, sparse event streams, which hold great potential for low latency and low power consumption. Despite many successful event camera-based applications, most accumulate events into frames and then apply conventional frame-based computer vision algorithms. These frame-based methods, though typically effective, forfeit the event camera's inherent advantages of low latency and low power consumption. To address this, we propose ASGCN, which efficiently processes data on an event-by-event basis and dynamically evolves a corresponding dynamic representation, enabling low latency and high sparsity of data representation. Sparse computation is further improved by introducing brain-inspired spiking neural networks, yielding low power consumption for ASGCN. Extensive and diverse experiments demonstrate the energy efficiency and low latency of our processing pipeline. On real-world event camera datasets in particular, our pipeline consumes more than 10,000 times less energy while achieving performance comparable to current frame-based methods.
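The abstract's core idea, processing each event as it arrives rather than accumulating events into frames, can be illustrated with a minimal sketch. The `Event` record and the single leaky integrate-and-fire (LIF) neuron below are illustrative assumptions, not the paper's ASGCN architecture; they show only why event-driven computation scales with the number of events instead of a fixed frame rate.

```python
import math
from dataclasses import dataclass

@dataclass
class Event:
    """A single event-camera event (hypothetical minimal schema)."""
    t: float       # timestamp in seconds
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 (brightness increase) or -1 (decrease)

def lif_process(events, tau=0.02, threshold=1.0, w=0.6):
    """Drive one LIF neuron event-by-event.

    The membrane potential leaks only across the interval between
    consecutive events, so no computation happens while the scene is
    static -- the event-driven sparsity advantage the abstract describes.
    """
    v = 0.0
    last_t = None
    spikes = []
    for ev in events:
        if last_t is not None:
            v *= math.exp(-(ev.t - last_t) / tau)  # exponential leak
        v += w * ev.polarity                       # integrate the event
        last_t = ev.t
        if v >= threshold:
            spikes.append(ev.t)                    # emit an output spike
            v = 0.0                                # hard reset
    return spikes
```

Two events arriving close together push the potential over threshold and emit a spike, while an isolated event decays away silently; total work is proportional to the event count, not to elapsed time.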

Published

2025-04-11

How to Cite

Zeng, D., Wang, Y., Cao, H., Liu, W., Xiao, Y., Lu, C., … Yang, Y. (2025). Leveraging Asynchronous Spiking Neural Networks for Ultra Efficient Event-Based Visual Processing. Proceedings of the AAAI Conference on Artificial Intelligence, 39(2), 1620–1628. https://doi.org/10.1609/aaai.v39i2.32154

Section

AAAI Technical Track on Cognitive Modeling & Cognitive Systems