Temporal-Coded Deep Spiking Neural Network with Easy Training and Robust Performance

Authors

  • Shibo Zhou, Binghamton University
  • Xiaohua Li, Binghamton University
  • Ying Chen, Harbin Institute of Technology
  • Sanjeev T. Chandrasekaran, University at Buffalo-SUNY
  • Arindam Sanyal, University at Buffalo-SUNY

Keywords

Bio-inspired Learning

Abstract

Spiking neural networks (SNNs) are promising, but their development has fallen far behind that of conventional deep neural networks (DNNs) because they are difficult to train. To resolve the training problem, we analyze the closed-form input-output response of spiking neurons and use this response expression to build abstract SNN models for training. This avoids calculating the membrane potential during training and makes direct training of SNNs as efficient as that of DNNs. We show that the nonleaky integrate-and-fire neuron with single-spike temporal coding is the best choice for directly trained deep SNNs. We develop an energy-efficient phase-domain signal processing circuit for this neuron and propose a direct-train deep SNN framework. Because training is easy, we train deep SNNs under weight quantization to study their robustness on low-cost neuromorphic hardware. Experiments show that our directly trained deep SNNs achieve the highest CIFAR-10 classification accuracy among SNNs, reach ImageNet classification accuracy within 1% of a DNN of equivalent architecture, and are robust to weight quantization and noise perturbation.
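To illustrate the kind of closed-form response the abstract refers to, the following is a minimal sketch, assuming a nonleaky integrate-and-fire neuron whose i-th synapse injects a constant current w_i starting at its input spike time t_i, so the membrane potential is V(t) = Σ_{t_i ≤ t} w_i (t − t_i) and the neuron fires once V reaches a threshold θ. Under that assumption the output spike time has a closed form, t_out = (θ + Σ w_i t_i) / Σ w_i over the causal set of inputs arriving before t_out; the function name and exact model details here are illustrative, not taken from the paper.

```python
def spike_time(times, weights, theta=1.0):
    """Closed-form output spike time of a nonleaky integrate-and-fire
    neuron with step synaptic currents: V(t) = sum_{t_i <= t} w_i*(t - t_i).
    Returns the first time V(t) reaches theta, or inf if it never fires.
    (Illustrative sketch; model details are assumptions, not the paper's.)"""
    order = sorted(range(len(times)), key=lambda i: times[i])
    w_sum = 0.0   # total current from inputs received so far
    wt_sum = 0.0  # weighted sum of their spike times
    for k, i in enumerate(order):
        w_sum += weights[i]
        wt_sum += weights[i] * times[i]
        if w_sum <= 0:
            continue  # net current not positive yet; potential cannot reach theta
        t_out = (theta + wt_sum) / w_sum
        # candidate is valid only if firing occurs before the next input arrives
        nxt = times[order[k + 1]] if k + 1 < len(order) else float("inf")
        if times[i] <= t_out <= nxt:
            return t_out
    return float("inf")
```

For example, two unit-weight inputs at t = 0 and t = 1 with θ = 2 give t_out = (2 + 0 + 1) / 2 = 1.5. Because the output spike time is a differentiable function of the weights, such an expression can be used directly in gradient-based training without simulating the membrane potential over time.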

Published

2021-05-18

How to Cite

Zhou, S., Li, X., Chen, Y., Chandrasekaran, S. T., & Sanyal, A. (2021). Temporal-Coded Deep Spiking Neural Network with Easy Training and Robust Performance. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 11143-11151. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17329

Section

AAAI Technical Track on Machine Learning V