SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation

Authors

  • Malyaban Bal The Pennsylvania State University
  • Abhronil Sengupta The Pennsylvania State University

DOI:

https://doi.org/10.1609/aaai.v38i10.28975

Keywords:

ML: Bio-inspired Learning, CMS: Other Foundations of Cognitive Modeling & Systems

Abstract

Large language models (LLMs), though growing exceedingly powerful, comprise orders of magnitude fewer neurons and synapses than the human brain, yet require significantly more power/energy to operate. In this work, we propose a novel bio-inspired spiking language model (LM) which aims to reduce the computational cost of conventional LMs by drawing motivation from the synaptic information flow in the brain. In this paper, we demonstrate a framework that leverages the average spiking rate of neurons at equilibrium to train a neuromorphic spiking LM using an implicit differentiation technique, thereby overcoming the non-differentiability problem of spiking neural network (SNN) based algorithms without using any type of surrogate gradient. The steady-state convergence of the spiking neurons also allows us to design a spiking attention mechanism, which is critical for developing a scalable spiking LM. Moreover, the convergence of the average spiking rate of neurons at equilibrium is utilized to develop a novel ANN-SNN knowledge distillation technique wherein a pre-trained BERT model serves as the “teacher” to train our “student” spiking architecture. While the primary architecture proposed in this paper is motivated by BERT, the technique can potentially be extended to other kinds of LLMs. Our work is the first to demonstrate the performance of an operational spiking LM architecture on multiple tasks in the GLUE benchmark. Our implementation source code is available at https://github.com/NeuroCompLab-psu/SpikingBERT.
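For readers unfamiliar with equilibrium-based implicit differentiation, the sketch below (written in PyTorch, and not taken from the SpikingBERT repository) illustrates the general idea of differentiating through a fixed point a* = f(a*, x), where a stands in for a layer's average spiking rate at steady state. The update function f, the iteration counts, and all variable names are illustrative assumptions, not the authors' code.

```python
import torch

def equilibrium_forward(f, x, a0, fwd_iters=60, bwd_iters=30):
    """Hypothetical sketch of implicit differentiation through an equilibrium
    a* = f(a*, x), where `a` plays the role of a layer's average spiking rate
    at steady state. Illustrative only, not the authors' implementation."""
    # 1) Relax to the fixed point without building a long autograd graph
    #    (i.e., no backpropagation-through-time over the spiking dynamics).
    with torch.no_grad():
        a = a0
        for _ in range(fwd_iters):
            a = f(a, x)

    # 2) Take a single differentiable step through f at the equilibrium.
    a = a.detach().requires_grad_()
    f_a = f(a, x)

    # 3) Implicit function theorem: dL/dtheta = dL/da* (I - J)^{-1} df/dtheta,
    #    with J = df/da evaluated at a*. Solve g = grad + J^T g by fixed-point
    #    iteration inside a backward hook instead of unrolling over time.
    def backward_hook(grad):
        g = grad
        for _ in range(bwd_iters):
            g = grad + torch.autograd.grad(f_a, a, g, retain_graph=True)[0]
        return g

    f_a.register_hook(backward_hook)
    return f_a
```

Because gradients are taken at the steady state rather than unrolled over spiking time steps, no surrogate gradient for the spike non-linearity is needed; the same equilibrium average spiking rates can then be compared against the hidden representations of a pre-trained BERT teacher (e.g., with an MSE loss) for the ANN-SNN knowledge distillation described in the abstract.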

Published

2024-03-24

How to Cite

Bal, M. ., & Sengupta, A. (2024). SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation. Proceedings of the AAAI Conference on Artificial Intelligence, 38(10), 10998-11006. https://doi.org/10.1609/aaai.v38i10.28975

Issue

Section

AAAI Technical Track on Machine Learning I