SEE: Syntax-Aware Entity Embedding for Neural Relation Extraction

Authors

  • Zhengqiu He, Soochow University
  • Wenliang Chen, Soochow University
  • Zhenghua Li, Soochow University
  • Meishan Zhang, Heilongjiang University
  • Wei Zhang, Alibaba Group
  • Min Zhang, Soochow University

DOI

https://doi.org/10.1609/aaai.v32i1.12042

Abstract

Distantly supervised relation extraction is an efficient approach to scaling relation extraction to very large corpora, and it has been widely used to find novel relational facts in plain text. Recent studies on neural relation extraction have shown great progress on this task by modeling sentences in low-dimensional spaces, but they seldom consider syntactic information when modeling the entities. In this paper, we propose to learn syntax-aware entity embeddings for neural relation extraction. First, we encode the context of an entity on its dependency tree as a sentence-level entity embedding using a tree-GRU. Then, we apply both intra-sentence and inter-sentence attention to obtain a sentence-set-level entity embedding over all sentences containing the focus entity pair. Finally, we combine the sentence embedding and the entity embedding for relation classification. We conduct experiments on a widely used real-world dataset, and the experimental results show that our model makes full use of all informative instances and achieves state-of-the-art performance on relation extraction.
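
To make the pipeline in the abstract concrete, the following PyTorch sketch shows one plausible reading of its two main components: a child-sum-style tree-GRU that composes an entity's dependency-subtree context into a sentence-level entity embedding, and a soft attention pooling that aggregates those embeddings across the sentence set. This is a minimal illustration under stated assumptions, not the authors' implementation; `ChildSumTreeGRU`, `attention_pool`, the gating details, and all dimensions are hypothetical.

```python
# A minimal sketch (assumptions, not the paper's code): a child-sum-style
# tree-GRU over a dependency tree, plus attention pooling over the
# resulting sentence-level entity embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChildSumTreeGRU(nn.Module):
    """Composes each node's hidden state from its word embedding and the
    sum of its children's hidden states, with GRU-style gating."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.wz = nn.Linear(in_dim + hid_dim, hid_dim)  # update gate
        self.wr = nn.Linear(in_dim + hid_dim, hid_dim)  # reset gate
        self.wh = nn.Linear(in_dim + hid_dim, hid_dim)  # candidate state

    def node_forward(self, x, child_h_sum):
        z = torch.sigmoid(self.wz(torch.cat([x, child_h_sum], dim=-1)))
        r = torch.sigmoid(self.wr(torch.cat([x, child_h_sum], dim=-1)))
        h_cand = torch.tanh(self.wh(torch.cat([x, r * child_h_sum], dim=-1)))
        return (1 - z) * child_h_sum + z * h_cand

    def forward(self, embeddings, children, node):
        # Recursively encode the dependency subtree rooted at `node`.
        child_h = [self.forward(embeddings, children, c) for c in children[node]]
        child_h_sum = (torch.stack(child_h).sum(dim=0) if child_h
                       else embeddings.new_zeros(self.wz.out_features))
        return self.node_forward(embeddings[node], child_h_sum)

def attention_pool(vectors, query):
    """Soft attention: weight each sentence-level entity embedding by its
    dot-product similarity to a query vector, then take the weighted sum."""
    scores = torch.stack([v @ query for v in vectors])
    weights = F.softmax(scores, dim=0)
    return (weights.unsqueeze(-1) * torch.stack(vectors)).sum(dim=0)

# Toy usage: two sentences of 4 words with 8-dim embeddings; node 0 is
# taken as the entity's head word in each dependency tree.
children = {0: [1, 2], 1: [], 2: [3], 3: []}
tree_gru = ChildSumTreeGRU(in_dim=8, hid_dim=16)
e1 = tree_gru(torch.randn(4, 8), children, node=0)  # entity embedding, sentence 1
e2 = tree_gru(torch.randn(4, 8), children, node=0)  # entity embedding, sentence 2
query = torch.randn(16)                             # stand-in attention query
set_level = attention_pool([e1, e2], query)         # sentence-set-level embedding
print(set_level.shape)                              # torch.Size([16])
```

The child-sum composition is one common choice for unordered dependency children; the paper's exact gating and attention parameterization may differ, and the set-level embedding would then be concatenated with a sentence encoding before the relation classifier.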

Published

2018-04-26

How to Cite

He, Z., Chen, W., Li, Z., Zhang, M., Zhang, W., & Zhang, M. (2018). SEE: Syntax-Aware Entity Embedding for Neural Relation Extraction. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12042