Improving Neural Fine-Grained Entity Typing With Knowledge Attention

Authors

  • Ji Xin Tsinghua University
  • Yankai Lin Tsinghua University
  • Zhiyuan Liu Tsinghua University
  • Maosong Sun Tsinghua University

DOI:

https://doi.org/10.1609/aaai.v32i1.12038

Keywords:

Entity Typing, Attention, Knowledge Base

Abstract

Fine-grained entity typing aims to identify the semantic type of an entity mentioned in a given piece of plain text. It is an important task that can benefit many natural language processing (NLP) applications. Most existing methods extract features separately from the entity mention and the context words for type classification. These methods inevitably fail to model the complex correlations between entity mentions and context words, and they neglect the rich background information about these entities available in knowledge bases (KBs). To address these issues, we take information from KBs into consideration to bridge entity mentions and their contexts, and thereby propose Knowledge-Attention Neural Fine-Grained Entity Typing. Experimental results and case studies on real-world datasets demonstrate that our model significantly outperforms other state-of-the-art methods, revealing the effectiveness of incorporating KB information for entity typing. Code and data for this paper can be found at https://github.com/thunlp/KNET.
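The core idea of knowledge attention can be illustrated with a minimal sketch: a KB embedding of the entity (e.g., learned with a method like TransE) serves as the query that scores each context-word representation, and the attention-weighted context is then used for type classification. The sketch below is an illustrative simplification, not the paper's exact formulation; the function name, the bilinear scoring matrix `W`, and all dimensions are hypothetical choices for exposition.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def knowledge_attention(context, entity_emb, W):
    """Attend over context words using a KB entity embedding as the query.

    context:    (T, d) hidden states of the T context words
                (e.g., from a bidirectional LSTM encoder).
    entity_emb: (k,) KB embedding of the entity mention (hypothetical TransE-style vector).
    W:          (k, d) bilinear attention parameters (a hypothetical choice;
                the paper's scoring function may differ).
    Returns attention weights (T,) and the attended context vector (d,).
    """
    # Bilinear score for each context word: s_t = e^T W h_t
    scores = context @ W.T @ entity_emb          # shape (T,)
    weights = softmax(scores)                    # shape (T,), sums to 1
    attended = weights @ context                 # shape (d,)
    return weights, attended

# Toy usage with random vectors, just to show the shapes involved.
rng = np.random.default_rng(0)
T, d, k = 5, 8, 4
ctx = rng.normal(size=(T, d))     # context-word representations
ent = rng.normal(size=k)          # KB embedding of the entity
W = rng.normal(size=(k, d))
weights, rep = knowledge_attention(ctx, ent, W)
```

In the full model, `rep` would be concatenated with features of the entity mention itself before the final type classifier; this sketch only covers the attention step.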

Published

2018-04-26

How to Cite

Xin, J., Lin, Y., Liu, Z., & Sun, M. (2018). Improving Neural Fine-Grained Entity Typing With Knowledge Attention. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12038