Sequence Generation with Label Augmentation for Relation Extraction

Authors

  • Bo Li, Peking University
  • Dingyao Yu, Peking University
  • Wei Ye, Peking University
  • Jinglei Zhang, Peking University
  • Shikun Zhang, Peking University

DOI:

https://doi.org/10.1609/aaai.v37i11.26532

Keywords:

SNLP: Information Extraction

Abstract

Sequence generation demonstrates promising performance in recent information extraction efforts by incorporating large-scale pre-trained Seq2Seq models. This paper investigates the merits of employing sequence generation in relation extraction, finding that when relation names or their synonyms are used as generation targets, both their textual semantics and the correlation (in terms of word sequence patterns) among them affect model performance. We then propose Relation Extraction with Label Augmentation (RELA), a Seq2Seq model with automatic label augmentation for RE. By label augmentation, we mean producing semantic synonyms for each relation name as generation targets. In addition, we present an in-depth analysis of the Seq2Seq model's behavior when dealing with RE. Experimental results show that RELA achieves competitive results compared with previous methods on four RE datasets.
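To make the idea of label augmentation concrete, the sketch below illustrates one plausible way to expand relation names into synonym sets and use them as Seq2Seq generation targets. This is a minimal, hypothetical example for illustration only: the synonym table, prompt format, and function names are assumptions, not the paper's actual implementation (RELA derives synonyms automatically).

```python
# Hypothetical sketch of label augmentation for Seq2Seq relation extraction.
# Each relation name is expanded into semantically similar phrases, and any
# of them counts as a valid generation target for that relation.

from typing import Dict, List

# Assumed synonym table; hard-coded here purely for illustration.
RELATION_SYNONYMS: Dict[str, List[str]] = {
    "org:founded_by": ["founded by", "established by", "created by"],
    "per:employee_of": ["employee of", "works for", "member of"],
    "no_relation": ["no relation", "unrelated"],
}

def build_training_pairs(sentence: str, head: str, tail: str,
                         relation: str) -> List[dict]:
    """Create one (input, target) pair per synonym of the gold relation."""
    source = f"{sentence} The relation between {head} and {tail} is"
    return [{"input": source, "target": syn}
            for syn in RELATION_SYNONYMS[relation]]

def decode_relation(generated: str) -> str:
    """Map a generated phrase back to a canonical relation name."""
    phrase = generated.strip().lower()
    for relation, synonyms in RELATION_SYNONYMS.items():
        if phrase in synonyms:
            return relation
    return "no_relation"

if __name__ == "__main__":
    pairs = build_training_pairs(
        "Acme Corp was founded by Jane Doe in 1999.",
        head="Acme Corp", tail="Jane Doe", relation="org:founded_by")
    for p in pairs:
        print(p)
    print(decode_relation("established by"))  # -> org:founded_by
```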

Published

2023-06-26

How to Cite

Li, B., Yu, D., Ye, W., Zhang, J., & Zhang, S. (2023). Sequence Generation with Label Augmentation for Relation Extraction. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 13043-13050. https://doi.org/10.1609/aaai.v37i11.26532

Section

AAAI Technical Track on Speech & Natural Language Processing