Relation Extraction Exploiting Full Dependency Forests

Authors

  • Lifeng Jin The Ohio State University
  • Linfeng Song Tencent AI Lab
  • Yue Zhang Westlake Institute for Advanced Study
  • Kun Xu Tencent AI Lab
  • Wei-Yun Ma Academia Sinica
  • Dong Yu Tencent AI Lab

DOI:

https://doi.org/10.1609/aaai.v34i05.6313

Abstract

Dependency syntax has long been recognized as a crucial source of features for relation extraction. Previous work considers the 1-best tree produced by a parser during preprocessing; however, errors propagated from an out-of-domain parser can hurt relation extraction performance. We propose to leverage full dependency forests for this task, where a full dependency forest encodes all possible trees. Such forest representations provide a differentiable connection between a parser and a relation extraction model, which also lets us study adjusting the parser parameters based on the end-task loss. Experiments on three datasets show that full dependency forests and parser adjustment give significant improvements over carefully designed baselines, achieving state-of-the-art or competitive performance on biomedical and newswire benchmarks.
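The abstract stays high-level, so the sketch below illustrates one plausible realization of the idea, not the authors' actual code: arc scores from a biaffine-style scorer are softmax-normalized into a dense "forest" of head probabilities, which then acts as a soft adjacency matrix for a graph-convolution step in the relation extractor. All class and parameter names (ForestRelationExtractor, head_mlp, dep_mlp, etc.) are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ForestRelationExtractor(nn.Module):
    """Illustrative sketch: relation extraction over a full dependency forest.

    Assumes contextual word states (e.g., from a BiLSTM or encoder) of shape
    [batch, seq_len, hidden_dim]. The forest here is simply a softmax over
    candidate heads for each token; the paper's construction may differ.
    """

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        self.head_mlp = nn.Linear(hidden_dim, hidden_dim)   # parser side
        self.dep_mlp = nn.Linear(hidden_dim, hidden_dim)    # parser side
        self.gcn = nn.Linear(hidden_dim, hidden_dim)        # RE side
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forest(self, states: torch.Tensor) -> torch.Tensor:
        # Arc score s[b, i, j]: token j is headed by token i.
        heads = self.head_mlp(states)                        # [B, T, H]
        deps = self.dep_mlp(states)                          # [B, T, H]
        scores = torch.bmm(heads, deps.transpose(1, 2))      # [B, T, T]
        # Softmax over candidate heads -> dense, differentiable forest.
        return F.softmax(scores, dim=1)

    def forward(self, states, ent1_idx, ent2_idx):
        adj = self.forest(states)                            # [B, T, T]
        # One graph-convolution step over the soft adjacency matrix:
        # each token aggregates its probable heads' representations.
        neigh = torch.bmm(adj.transpose(1, 2), states)       # [B, T, H]
        states = torch.relu(self.gcn(neigh) + states)
        # Classify the relation from the two entity positions.
        batch = torch.arange(states.size(0))
        e1 = states[batch, ent1_idx]
        e2 = states[batch, ent2_idx]
        return self.classifier(torch.cat([e1, e2], dim=-1))
```

Because the forest is built inside the same computation graph as the relation classifier, backpropagating the relation loss also updates head_mlp and dep_mlp, which is the spirit of the parser-adjustment experiments described in the abstract.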

Published

2020-04-03

How to Cite

Jin, L., Song, L., Zhang, Y., Xu, K., Ma, W.-Y., & Yu, D. (2020). Relation Extraction Exploiting Full Dependency Forests. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8034-8041. https://doi.org/10.1609/aaai.v34i05.6313

Section

AAAI Technical Track: Natural Language Processing