Prototypical Replay with Old-class Focusing Knowledge Distillation for Incremental Named Entity Recognition
DOI:
https://doi.org/10.1609/aaai.v39i23.34651
Abstract
Catastrophic forgetting is a key challenge in incremental named entity recognition (INER). Existing methods often address this issue through distillation-based approaches, which transfer previously learned knowledge from the old model to the new one. However, these methods may not fully equip the new model with an adequate understanding of the characteristics of old entity types, leading to confusion when classifying tokens associated with those types. To address this challenge, we propose a novel method called Prototypical Replay with Old-class Focusing Knowledge Distillation (POF) for INER. Our approach preserves the main characteristics of each previous entity type by storing compact prototypes and replaying them with appropriate frequency. This replay strategy enables the new model to review the knowledge of old entity types while minimizing storage needs. Additionally, we introduce an old-class focusing knowledge distillation (OFKD) loss, which distills features only in old-class regions, maintaining the quality of old-class prototypes and preventing ineffective prototypical replay while preserving sufficient plasticity for learning new entity types. We conducted experiments on three benchmark datasets (Few-NERD, I2B2, and OntoNotes5), and the results demonstrate that our method outperforms all previous state-of-the-art methods.
Published
2025-04-11
How to Cite
Liu, Z., Zhu, Q., Li, C., & Chen, H. (2025). Prototypical Replay with Old-class Focusing Knowledge Distillation for Incremental Named Entity Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 39(23), 24705–24713. https://doi.org/10.1609/aaai.v39i23.34651
Issue
Section
AAAI Technical Track on Natural Language Processing II
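The abstract describes two components: compact per-class prototypes that are stored and replayed, and a distillation loss applied only in old-class regions. The sketch below is an illustration of those two ideas only, not the authors' implementation; the helper names (`class_prototypes`, `old_class_focused_distill`) and the plain mean-feature prototype and masked L2 distillation are simplifying assumptions made for this example.

```python
import numpy as np

def class_prototypes(features, labels):
    """Illustrative prototype construction: one compact prototype per
    entity type, here the mean feature vector of its tokens. Storing only
    these vectors (rather than raw examples) keeps replay storage small."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def old_class_focused_distill(new_feats, old_feats, old_class_mask):
    """Illustrative old-class-focused distillation: an L2 feature-matching
    loss computed only at token positions flagged as old-class regions.
    Tokens outside the mask are left unconstrained, preserving plasticity
    for learning new entity types."""
    sq_diff = (new_feats - old_feats) ** 2   # per-token feature discrepancy
    masked = sq_diff[old_class_mask]         # keep only old-class positions
    return masked.mean() if masked.size else 0.0
```

For instance, if the new model's features at old-class positions exactly match the old model's, the loss is zero; drift at those positions is penalized, while positions outside the mask contribute nothing.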