TY  - JOUR
AU  - Zhou, Shibo
AU  - Li, Xiaohua
AU  - Chen, Ying
AU  - Chandrasekaran, Sanjeev T.
AU  - Sanyal, Arindam
PY  - 2021/05/18
Y2  - 2024/03/28
TI  - Temporal-Coded Deep Spiking Neural Network with Easy Training and Robust Performance
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 35
IS  - 12
SE  - AAAI Technical Track on Machine Learning V
DO  - 10.1609/aaai.v35i12.17329
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/17329
SP  - 11143-11151
AB  - Spiking neural network (SNN) is promising but the development has fallen far behind conventional deep neural networks (DNNs) because of difficult training. To resolve the training problem, we analyze the closed-form input-output response of spiking neurons and use the response expression to build abstract SNN models for training. This avoids calculating membrane potential during training and makes the direct training of SNN as efficient as DNN. We show that the nonleaky integrate-and-fire neuron with single-spike temporal-coding is the best choice for direct-train deep SNNs. We develop an energy-efficient phase-domain signal processing circuit for the neuron and propose a direct-train deep SNN framework. Thanks to easy training, we train deep SNNs under weight quantizations to study their robustness over low-cost neuromorphic hardware. Experiments show that our direct-train deep SNNs have the highest CIFAR-10 classification accuracy among SNNs, achieve ImageNet classification accuracy within 1% of the DNN of equivalent architecture, and are robust to weight quantization and noise perturbation.
ER  - 