Enhancing the Robustness of Spiking Neural Networks with Stochastic Gating Mechanisms
DOI:
https://doi.org/10.1609/aaai.v38i1.27804
Keywords:
CMS: (Computational) Cognitive Architectures, CMS: Neural Spike Coding
Abstract
Spiking neural networks (SNNs) exploit neural spikes to provide solutions for low-power intelligent applications on neuromorphic hardware. Although SNNs achieve high computational efficiency through spike-based communication, they remain vulnerable to adversarial attacks and noise perturbations. In the brain, neuronal responses generally exhibit stochasticity induced by ion channels and synapses, yet the role of this stochasticity in computing tasks is poorly understood. Inspired by this, we design a stochastic gating spiking neural model for layer-by-layer spike communication, introducing stochasticity into SNNs. Through theoretical analysis, we show that our gating model can be viewed as a regularizer that prevents error amplification under attacks. Our work also explains the robustness of Poisson coding. Experimental results demonstrate that our method can be used alone or combined with existing robustness-enhancement algorithms to improve SNN robustness and reduce SNN energy consumption. We hope our work will shed new light on the role of stochasticity in the computation of SNNs. Our code is available at https://github.com/DingJianhao/StoG-meets-SNN/.
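To make the gating idea concrete, here is a minimal, hypothetical sketch (not the paper's exact model — see the linked repository for that): a stochastic gate transmits each binary spike between layers with probability p, analogous to probabilistic synaptic transmission. The function name, the value of p, and the spike representation are all illustrative assumptions.

```python
import random

def stochastic_gate(spikes, p, rng):
    """Pass each spike (0/1) through an independent Bernoulli(p) gate.

    A spike is transmitted with probability p and dropped otherwise;
    silent inputs stay silent, so the output remains a valid spike train.
    """
    return [s if (s and rng.random() < p) else 0 for s in spikes]

rng = random.Random(0)
spikes = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # toy binary spike vector
gated = stochastic_gate(spikes, 0.8, rng)
```

Because the gate can only suppress spikes, never create them, the gated output is a random "thinned" version of the input; intuitively, this is what limits how much an adversarial perturbation can be amplified from layer to layer.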
Published
2024-03-25
How to Cite
Ding, J., Yu, Z., Huang, T., & Liu, J. K. (2024). Enhancing the Robustness of Spiking Neural Networks with Stochastic Gating Mechanisms. Proceedings of the AAAI Conference on Artificial Intelligence, 38(1), 492-502. https://doi.org/10.1609/aaai.v38i1.27804
Section
AAAI Technical Track on Cognitive Modeling & Cognitive Systems