Progressive Multi-task Learning with Controlled Information Flow for Joint Entity and Relation Extraction

Authors

  • Kai Sun, Beihang University
  • Richong Zhang, Beihang University
  • Samuel Mensah, Beihang University
  • Yongyi Mao, University of Ottawa
  • Xudong Liu, Beihang University

Keywords

Information Extraction

Abstract

Multitask learning has shown promising performance in learning multiple related tasks simultaneously, and a variety of model architectures have been proposed, especially for supervised classification problems. One goal of multitask learning is to extract a good representation that sufficiently captures the information in the input that is relevant to the output of each learning task. To achieve this objective, in this paper we design a multitask learning architecture based on the observation that correlations exist between the outputs of some related tasks (e.g., entity recognition and relation extraction), and that these correlations reflect the relevant features that need to be extracted from the input. Since the true outputs are unobserved, our proposed model exploits task predictions made in the lower layers of the neural model, referred to in this work as early predictions. We control the injection of these early predictions to ensure that we extract good task-specific representations for classification. We refer to this model as a Progressive Multitask learning model with Explicit Interactions (PMEI). Extensive experiments on multiple benchmark datasets produce state-of-the-art results on the joint entity and relation extraction task.
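The core mechanism described in the abstract can be illustrated with a minimal sketch: early entity predictions from a lower layer are projected back into the hidden space and injected into the relation representation through a learned gate. This is not the authors' implementation; all dimensions, weight matrices, and the specific gating form here are hypothetical, chosen only to show the idea of controlled information flow between tasks.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Hypothetical sizes: T tokens, d hidden units, k entity labels.
T, d, k = 4, 8, 3
h = rng.normal(size=(T, d))        # shared encoder states

# Early (lower-layer) entity predictions from a linear classifier.
W_ent = rng.normal(size=(d, k))
early_ent = softmax(h @ W_ent)     # (T, k) early label distribution

# Controlled injection: project predictions back into the hidden
# space, then let a sigmoid gate decide how much of that signal
# flows into the relation-specific representation.
W_back = rng.normal(size=(k, d))
W_gate = rng.normal(size=(2 * d, d))
inj = early_ent @ W_back                                        # (T, d)
gate = 1 / (1 + np.exp(-np.concatenate([h, inj], axis=-1) @ W_gate))
h_rel = h + gate * inj             # gated, task-specific representation
```

The gate lets the model attenuate noisy early predictions, so that only the part of the entity signal useful for relation classification is passed on.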

Published

2021-05-18

How to Cite

Sun, K., Zhang, R., Mensah, S., Mao, Y., & Liu, X. (2021). Progressive Multi-task Learning with Controlled Information Flow for Joint Entity and Relation Extraction. Proceedings of the AAAI Conference on Artificial Intelligence, 35(15), 13851-13859. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17632

Section

AAAI Technical Track on Speech and Natural Language Processing II