Continual Relation Extraction via Sequential Multi-Task Learning


  • Thanh-Thien Le VinAI Research, Vietnam
  • Manh Nguyen Hanoi University of Science and Technology, Vietnam
  • Tung Thanh Nguyen University of Michigan, USA
  • Linh Ngo Van Hanoi University of Science and Technology, Vietnam
  • Thien Huu Nguyen University of Oregon, USA



NLP: Information Extraction, NLP: Learning & Optimization for NLP


Building continual relation extraction (CRE) models that can adapt to an ever-growing ontology of relations is a cornerstone information extraction task serving a variety of dynamic real-world domains. To mitigate catastrophic forgetting in CRE, existing state-of-the-art approaches have effectively utilized rehearsal techniques from continual learning and achieved remarkable success. However, managing the multiple objectives associated with memory-based rehearsal remains underexplored: these objectives are often combined by simple summation, overlooking complex trade-offs among them. In this paper, we propose Continual Relation Extraction via Sequential Multi-task Learning (CREST), a novel CRE approach built upon a multi-task learning framework tailored for continual learning. CREST accounts for the disparity in the magnitudes of the gradient signals of different objectives, thereby effectively handling the inherent differences between multi-task learning and continual learning. Through extensive experiments on multiple datasets, CREST demonstrates significant improvements in CRE performance as well as superiority over other state-of-the-art multi-task learning frameworks, offering a promising solution to the challenges of continual learning in this domain.
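To illustrate the gradient-magnitude disparity the abstract refers to, the toy sketch below contrasts naive loss summation with a simple per-objective gradient normalization before combining update directions. This is a minimal illustrative example, not the actual CREST formulation; the objective values, the `combine_gradients` helper, and the normalization scheme are all assumptions introduced here for exposition.

```python
import numpy as np

def combine_gradients(grads):
    """Scale each objective's gradient to unit norm before summing,
    so no single objective dominates the shared update direction.
    (Illustrative only; not the exact CREST method.)"""
    combined = np.zeros_like(grads[0])
    for g in grads:
        norm = np.linalg.norm(g)
        if norm > 0:
            combined += g / norm
    return combined

# Shared parameter vector with two toy objectives:
# a "new-task" loss with a large gradient and a much weaker
# "rehearsal" loss over memorized examples.
w = np.array([1.0, -2.0])
grad_new = 100.0 * w        # gradient of the new-task objective
grad_rehearsal = 0.01 * w   # gradient of the rehearsal objective

naive = grad_new + grad_rehearsal  # dominated by the new-task term
balanced = combine_gradients([grad_new, grad_rehearsal])
```

With naive summation, the rehearsal signal is four orders of magnitude smaller than the new-task signal and is effectively ignored, which is one route to catastrophic forgetting; after normalization, both objectives contribute equally to the combined direction.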



How to Cite

Le, T.-T., Nguyen, M., Nguyen, T. T., Ngo Van, L., & Nguyen, T. H. (2024). Continual Relation Extraction via Sequential Multi-Task Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(16), 18444-18452.



AAAI Technical Track on Natural Language Processing I