Pseudo-Spiking Neurons: A Noise-Based Training Framework for Heterogeneous-Latency Spiking Neural Networks

Authors

  • Yuxuan Zhang — Beihang University
  • Yuhang Sun — Beihang University; Beijing Zhongguancun Academy
  • Hongjue Li — Beihang University
  • Yue Deng — Beihang University; Beijing Zhongguancun Academy
  • Wen Yao — Defense Innovation Institute, Chinese Academy of Military Science

DOI:

https://doi.org/10.1609/aaai.v40i34.40086

Abstract

Spiking Neural Networks (SNNs) promise significant energy efficiency by processing information via sparse, event-driven spikes. However, realizing this potential is hindered by the conventional use of a rigid, uniform timestep T. This constraint imposes a challenging trade-off between accuracy and latency, while also incurring the prohibitive training costs of Backpropagation Through Time (BPTT). To overcome this limitation, we introduce the Pseudo-Spiking Neuron (PseudoSN), a novel training proxy that treats latency as an intrinsic, learnable parameter of each neuron. Building on the efficiency of rate-based methods, the PseudoSN models temporal dynamics in a single, BPTT-free pass. It employs a learnable probabilistic noise scheme to emulate the discretization effects of spike generation (e.g., clipping and quantization), making the neuron-specific timestep—and thus latency—directly optimizable via backpropagation. Integrated into a hardware-aware objective, our framework trains heterogeneous-latency SNNs that autonomously learn to balance accuracy, latency, and energy, establishing a new state-of-the-art on major benchmarks.
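The core idea in the abstract—replacing the non-differentiable clipping and quantization of rate-coded spike generation with a smooth, noise-based surrogate—can be illustrated with a minimal numerical sketch. The function names and the exact noise formulation below are our assumptions for illustration, not the paper's implementation: we model the quantization error of a rate `a` discretized over `T` timesteps as additive uniform noise of width `1/T`, which keeps the forward pass continuous in `T`.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_discretize(a, T):
    """Exact rate produced by spike generation with an integer timestep T:
    the firing rate is clipped to [0, 1] and quantized to multiples of 1/T."""
    return np.clip(np.round(np.clip(a, 0.0, 1.0) * T) / T, 0.0, 1.0)

def pseudo_spike(a, T):
    """Hypothetical noise-based training proxy (our sketch, not the paper's code).

    Instead of hard rounding, add uniform noise in [-1/(2T), 1/(2T)] to the
    clipped rate, emulating the quantization error distribution. Because the
    expression is continuous in T, a per-neuron timestep can in principle be
    optimized by backpropagation rather than fixed network-wide."""
    clipped = np.clip(a, 0.0, 1.0)
    noise = rng.uniform(-0.5 / T, 0.5 / T, size=np.shape(a))
    return np.clip(clipped + noise, 0.0, 1.0)

# The proxy's output stays within half a quantization bin of the clipped rate,
# so it matches the error magnitude of true discretization in expectation.
rates = np.array([0.34, 0.80, 1.20, -0.10])
print(true_discretize(rates, 4))   # exact 2-bit quantization
print(pseudo_spike(rates, 4))      # noisy, differentiable surrogate
```

In a full framework, `T` would be a learnable tensor (one value per neuron) inside a hardware-aware loss that penalizes large timesteps; this sketch only shows why the noise form makes that gradient well-defined.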

Published

2026-03-14

How to Cite

Zhang, Y., Sun, Y., Li, H., Deng, Y., & Yao, W. (2026). Pseudo-Spiking Neurons: A Noise-Based Training Framework for Heterogeneous-Latency Spiking Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 40(34), 28555-28563. https://doi.org/10.1609/aaai.v40i34.40086

Section

AAAI Technical Track on Machine Learning XI