Effective Continual Learning for Text Classification with Lightweight Snapshots

Authors

  • Jue Wang, Zhejiang University
  • Dajie Dong, Zhejiang University
  • Lidan Shou, Zhejiang University
  • Ke Chen, Zhejiang University
  • Gang Chen, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v37i8.26206

Keywords:

ML: Lifelong and Continual Learning, SNLP: Text Classification

Abstract

Continual learning is known to suffer from catastrophic forgetting, a phenomenon where previously learned concepts are forgotten upon learning new tasks. A natural remedy is to use trained models for old tasks as ‘teachers’ to regularize the update of the current model and prevent such forgetting. However, this requires storing all past models, which is very space-consuming for large models such as BERT and thus impractical in real-world applications. To tackle this issue, we propose to construct snapshots of seen tasks whose key knowledge is captured in lightweight adapters. During continual learning, we transfer knowledge from past snapshots to the current model through knowledge distillation, allowing the current model to review previously learned knowledge while learning new tasks. We also design representation recalibration to better handle the class-incremental setting. Experiments over various task sequences show that our approach effectively mitigates catastrophic forgetting and outperforms all baselines.
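To illustrate the general idea described in the abstract, the following is a minimal sketch in PyTorch, not the authors' implementation: a shared frozen encoder, a lightweight adapter plus classifier head stored per past task as a "snapshot", and a training step that combines cross-entropy on the new task with a distillation term from each snapshot. The class names (Adapter, SnapshotModel), the bottleneck size, the temperature T, and the loss weight alpha are illustrative assumptions and are not taken from the paper.

    # Minimal sketch (assumptions noted above); not the paper's actual method.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Adapter(nn.Module):
        """Small residual bottleneck module; one is stored per snapshot."""
        def __init__(self, hidden: int, bottleneck: int = 64):
            super().__init__()
            self.down = nn.Linear(hidden, bottleneck)
            self.up = nn.Linear(bottleneck, hidden)

        def forward(self, h):
            return h + self.up(F.relu(self.down(h)))

    class SnapshotModel(nn.Module):
        """Shared frozen encoder + lightweight adapter + classifier head."""
        def __init__(self, encoder: nn.Module, hidden: int, num_classes: int):
            super().__init__()
            self.encoder = encoder          # frozen, shared across all snapshots
            self.adapter = Adapter(hidden)  # lightweight, task-specific
            self.head = nn.Linear(hidden, num_classes)

        def forward(self, x):
            with torch.no_grad():
                h = self.encoder(x)         # assumed to return [batch, hidden] features
            return self.head(self.adapter(h))

    def distillation_loss(student_logits, teacher_logits, T: float = 2.0):
        """KL divergence between temperature-softened student and teacher outputs."""
        return F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * T * T

    def training_step(current_model, snapshots, x, y, alpha: float = 0.5):
        """Cross-entropy on the new task plus distillation from past snapshots."""
        logits = current_model(x)
        loss = F.cross_entropy(logits, y)
        for snap in snapshots:              # one lightweight snapshot per old task
            with torch.no_grad():
                teacher_logits = snap(x)
            # The student mimics each teacher on that teacher's own classes;
            # here we assume the old classes occupy the leading output slots.
            loss = loss + alpha * distillation_loss(
                logits[:, : teacher_logits.size(-1)], teacher_logits
            )
        return loss

Because each snapshot only adds an adapter and a classifier head on top of the shared encoder, the storage cost per past task is a small fraction of a full BERT-sized model, which is what makes keeping one teacher per task feasible.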

Published

2023-06-26

How to Cite

Wang, J., Dong, D., Shou, L., Chen, K., & Chen, G. (2023). Effective Continual Learning for Text Classification with Lightweight Snapshots. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 10122-10130. https://doi.org/10.1609/aaai.v37i8.26206

Section

AAAI Technical Track on Machine Learning III