Spiking Neural Networks with Improved Inherent Recurrence Dynamics for Sequential Learning

Authors

  • Wachirawit Ponghiran, Purdue University, West Lafayette, IN
  • Kaushik Roy, Purdue University, West Lafayette, IN

DOI

https://doi.org/10.1609/aaai.v36i7.20771

Keywords

Machine Learning (ML), Speech & Natural Language Processing (SNLP)

Abstract

Spiking neural networks (SNNs) with leaky integrate-and-fire (LIF) neurons can be operated in an event-driven manner and have internal states that retain information over time, providing opportunities for energy-efficient neuromorphic computing, especially on edge devices. Note, however, that many representative works on SNNs do not fully demonstrate the usefulness of their inherent recurrence (the membrane potential retaining information about the past) for sequential learning. Most such works train SNNs to recognize static images from input representations that are artificially expanded in time through rate coding. We show that SNNs can be trained for practical sequential tasks by proposing modifications to a network of LIF neurons that enable the internal states to learn long sequences and make their inherent recurrence resilient to the vanishing gradient problem. We then develop a training scheme for the proposed SNNs with improved inherent recurrence dynamics. Our training scheme allows spiking neurons to produce multi-bit outputs (as opposed to binary spikes), which helps mitigate the mismatch between the derivative of the spiking neurons' activation function and the surrogate derivative used to overcome their non-differentiability. Our experimental results indicate that the proposed SNN architecture yields accuracy comparable to that of LSTMs on the TIMIT and LibriSpeech 100h speech recognition datasets (within 1.10% and 0.36%, respectively), but with 2x fewer parameters. The sparse SNN outputs also lead to 10.13x and 11.14x savings in multiplication operations on TIMIT and LibriSpeech 100h, respectively, compared to GRUs, which are generally considered a lightweight alternative to LSTMs.
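
The abstract's two key mechanisms, a discrete-time LIF neuron whose membrane potential carries state across timesteps, and a surrogate derivative that stands in for the non-differentiable spike function during backpropagation, can be illustrated in a few lines of PyTorch. The sketch below is a minimal illustration under stated assumptions, not the paper's method: the names (SurrogateSpike, LIFLayer, beta, threshold) are hypothetical, a simple boxcar surrogate is used, and the paper's multi-bit outputs and improved recurrence dynamics are not reproduced.

```python
# Minimal discrete-time LIF layer with surrogate-gradient training (PyTorch).
# Illustrative only; the names and the boxcar surrogate are assumptions,
# not the paper's actual formulation.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside step in the forward pass; boxcar surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()  # binary spike

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        # Let gradients flow only near the threshold. This surrogate differs
        # from the true derivative, which is zero almost everywhere; the
        # abstract's multi-bit outputs help mitigate that mismatch.
        return grad_output * (v_minus_thresh.abs() < 0.5).float()


class LIFLayer(nn.Module):
    """One layer of LIF neurons unrolled over an input sequence."""

    def __init__(self, in_features, out_features, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.beta = beta            # membrane leak factor in (0, 1)
        self.threshold = threshold  # firing threshold

    def forward(self, x_seq):
        # x_seq: (time, batch, in_features)
        v = x_seq.new_zeros(x_seq.shape[1], self.fc.out_features)
        spikes = []
        for x_t in x_seq:
            v = self.beta * v + self.fc(x_t)              # leak + integrate
            s = SurrogateSpike.apply(v - self.threshold)  # fire
            v = v - s * self.threshold                    # soft reset keeps residual state
            spikes.append(s)
        return torch.stack(spikes)  # (time, batch, out_features)


# Example: a forward/backward pass over a random 50-step sequence.
if __name__ == "__main__":
    layer = LIFLayer(64, 32)
    out = layer(torch.randn(50, 8, 64))
    out.sum().backward()  # gradients flow through the surrogate
```

The membrane potential v, carried across timesteps and only partially reset on firing, is the "inherent recurrence" the abstract refers to; without the surrogate in the backward pass, the spike nonlinearity would block gradient flow entirely.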

Published

2022-06-28

How to Cite

Ponghiran, W., & Roy, K. (2022). Spiking Neural Networks with Improved Inherent Recurrence Dynamics for Sequential Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 8001-8008. https://doi.org/10.1609/aaai.v36i7.20771

Issue

Vol. 36 No. 7 (2022)

Section

AAAI Technical Track on Machine Learning II