Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions

Authors

  • Guoliang Ji National Laboratory of Pattern Recognition (NLPR), Institute of Automation Chinese Academy of Sciences
  • Kang Liu National Laboratory of Pattern Recognition (NLPR), Institute of Automation Chinese Academy of Sciences
  • Shizhu He National Laboratory of Pattern Recognition (NLPR), Institute of Automation Chinese Academy of Sciences
  • Jun Zhao National Laboratory of Pattern Recognition (NLPR), Institute of Automation Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v31i1.10953

Abstract

Distant supervision for relation extraction is an efficient method for scaling relation extraction to very large corpora that contain thousands of relations. However, existing approaches have flaws in selecting valid instances and lack background knowledge about the entities. In this paper, we propose a sentence-level attention model to select valid instances, which makes full use of the supervision information from knowledge bases. We also extract entity descriptions from Freebase and Wikipedia pages to supplement background knowledge for our task. The background knowledge not only provides more information for predicting relations, but also yields better entity representations for the attention module. We conduct three experiments on a widely used dataset, and the experimental results show that our approach significantly outperforms all the baseline systems.
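To illustrate the core idea of sentence-level attention over a bag of instances, here is a minimal sketch in NumPy. It is not the paper's implementation: the sentence embeddings and the query vector (which in the paper would be informed by entity representations learned from descriptions) are assumed inputs, and the function names are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def bag_representation(sentence_embs, query):
    """Weight each sentence in an entity-pair bag by its relevance to
    the query vector, then return the attention-weighted bag vector.

    sentence_embs: (n_sentences, dim) array of sentence embeddings.
    query: (dim,) vector standing in for the relation/entity query.
    """
    scores = sentence_embs @ query      # one relevance score per sentence
    weights = softmax(scores)           # attention distribution over the bag
    return weights @ sentence_embs, weights
```

The attention weights let the model down-weight noisy sentences that mention the entity pair but do not express the target relation, instead of treating every distantly labeled sentence as equally valid.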

Published

2017-02-12

How to Cite

Ji, G., Liu, K., He, S., & Zhao, J. (2017). Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10953

Section

Main Track: NLP and Knowledge Representation