Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction

Authors

  • Benfeng Xu, University of Science and Technology of China
  • Quan Wang, Baidu Inc.
  • Yajuan Lyu, Baidu Inc.
  • Yong Zhu, Baidu Inc.
  • Zhendong Mao, University of Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v35i16.17665

Keywords:

Information Extraction

Abstract

Entities, as the essential elements in relation extraction tasks, exhibit a certain structure. In this work, we formulate such entity structure as distinctive dependencies between mention pairs. We then propose SSAN (Structured Self-Attention Network), which incorporates these structural dependencies within the standard self-attention mechanism and throughout the overall encoding stage. Specifically, we design two alternative transformation modules inside each self-attention building block to produce attentive biases that adaptively regularize its attention flow. Our experiments demonstrate the usefulness of the proposed entity structure and the effectiveness of SSAN: it significantly outperforms competitive baselines, achieving new state-of-the-art results on three popular document-level relation extraction datasets. We further provide ablation studies and visualizations that show how the entity structure guides the model toward better relation extraction. Our code is publicly available.
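The mechanism the abstract describes, in which per-mention-pair structural dependencies produce additive biases that regularize the attention flow inside every self-attention block, can be sketched as follows. This is a minimal illustrative PyTorch implementation, not the authors' released code: the biaffine form of the bias, the class and parameter names, and the shapes of the inputs are assumptions made for the sketch.

```python
# Illustrative sketch of structure-aware self-attention: a per-dependency-type
# biaffine transformation yields an additive bias on the attention scores.
# Hypothetical names and shapes; written for clarity, not efficiency.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredSelfAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int, num_dep_types: int):
        super().__init__()
        assert d_model % num_heads == 0
        self.h, self.d = num_heads, d_model // num_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # One biaffine matrix per dependency type and head, zero-initialized
        # so training starts from vanilla (unbiased) self-attention.
        self.biaffine = nn.Parameter(
            torch.zeros(num_dep_types, self.h, self.d, self.d))

    def forward(self, x, dep_ids):
        # x: (B, T, d_model); dep_ids: (B, T, T) integer dependency type
        # for each token pair (i, j), derived from the entity structure.
        B, T, _ = x.shape
        q = self.q_proj(x).view(B, T, self.h, self.d).transpose(1, 2)
        k = self.k_proj(x).view(B, T, self.h, self.d).transpose(1, 2)
        v = self.v_proj(x).view(B, T, self.h, self.d).transpose(1, 2)
        scores = q @ k.transpose(-2, -1) / self.d ** 0.5   # (B, h, T, T)
        # Attentive bias: q_i^T A_s k_j, where A_s is the biaffine matrix
        # selected by the dependency type s of the pair (i, j).
        A = self.biaffine[dep_ids]                          # (B, T, T, h, d, d)
        bias = torch.einsum('bhid,bijhde,bhje->bhij', q, A, k)
        attn = F.softmax(scores + bias / self.d ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return out  # output projection omitted for brevity

# Toy usage: batch of 2 documents, 8 tokens, 3 dependency types.
layer = StructuredSelfAttention(d_model=64, num_heads=4, num_dep_types=3)
x = torch.randn(2, 8, 64)
dep_ids = torch.randint(0, 3, (2, 8, 8))
print(layer(x, dep_ids).shape)  # torch.Size([2, 8, 64])
```

In this sketch the bias is recomputed from the current queries and keys in every attention layer, which matches the abstract's claim that the dependencies act not only within each self-attention block but throughout the overall encoding stage.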

Published

2021-05-18

How to Cite

Xu, B., Wang, Q., Lyu, Y., Zhu, Y., & Mao, Z. (2021). Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 14149-14157. https://doi.org/10.1609/aaai.v35i16.17665

Issue

Vol. 35 No. 16 (2021)

Section

AAAI Technical Track on Speech and Natural Language Processing III