Knowledge-Graph Augmented Word Representations for Named Entity Recognition


  • Qizhen He (Bilibili)
  • Liang Wu (Bilibili)
  • Yida Yin (Bilibili)
  • Heming Cai (Bilibili)



By modeling context information, ELMo and BERT have successfully advanced the state of the art in word representation and demonstrated their effectiveness on the Named Entity Recognition task. In this paper, in addition to such context modeling, we propose to encode prior knowledge of entities from an external knowledge base into the representation, and introduce a Knowledge-Graph Augmented Word Representation, or KAWR, for named entity recognition. KAWR provides a knowledge-aware representation for words by 1) encoding entity information from a pre-trained KG embedding model with a new recurrent unit (GERU), and 2) strengthening context modeling from a knowledge perspective via a relation attention scheme based on the entity relations defined in the KG. We demonstrate that KAWR, as an augmented version of existing linguistic word representations, improves F1 scores on 5 datasets in various domains by +0.46∼+2.07. KAWR also generalizes better to new entities that do not appear in the training sets.
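The core idea of injecting KG entity embeddings into a contextual word representation can be illustrated with a gated fusion unit. The sketch below is a minimal, hypothetical illustration and is not the paper's actual GERU formulation (which is not reproduced here): it assumes a learned per-dimension gate that decides, for each word, how much of the pre-trained entity embedding to mix into the contextual vector; all names (`GatedEntityFusion`, `W_g`, `b_g`) are invented for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GatedEntityFusion:
    """Hypothetical GERU-style gate: mixes a contextual word vector h
    (e.g. from ELMo/BERT) with a pre-trained KG entity embedding e."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # W_g projects the concatenation [h; e] to a per-dimension
        # gate value in (0, 1).
        self.W_g = rng.normal(scale=0.1, size=(dim, 2 * dim))
        self.b_g = np.zeros(dim)

    def __call__(self, h, e):
        g = sigmoid(self.W_g @ np.concatenate([h, e]) + self.b_g)
        # Convex combination per dimension: the gate controls how much
        # entity knowledge is injected. Words with no linked entity can
        # pass e = 0, leaving h largely intact.
        return g * e + (1.0 - g) * h
```

Because the output is a per-dimension convex combination of `h` and `e`, the fused vector stays within the range spanned by the two inputs, which keeps the augmented representation compatible with the downstream NER tagger's input space.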




How to Cite

He, Q., Wu, L., Yin, Y., & Cai, H. (2020). Knowledge-Graph Augmented Word Representations for Named Entity Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 7919-7926.



AAAI Technical Track: Natural Language Processing