Learning Entity and Relation Embeddings for Knowledge Graph Completion

Authors

  • Yankai Lin Tsinghua University
  • Zhiyuan Liu Tsinghua University
  • Maosong Sun Tsinghua University
  • Yang Liu Samsung Research and Development Institute of China
  • Xuan Zhu Samsung Research and Development Institute of China

DOI:

https://doi.org/10.1609/aaai.v29i1.9491

Keywords:

knowledge graph embedding, knowledge graph completion, relation extraction, knowledge representation

Abstract

Knowledge graph completion aims to perform link prediction between entities. In this paper, we consider the approach of knowledge graph embeddings. Recently, models such as TransE and TransH build entity and relation embeddings by regarding a relation as a translation from head entity to tail entity. We note that these models simply put both entities and relations within the same semantic space. In fact, an entity may have multiple aspects, and various relations may focus on different aspects of entities, which makes a common space insufficient for modeling. In this paper, we propose TransR, which builds entity and relation embeddings in a separate entity space and relation-specific relation spaces. We then learn embeddings by first projecting entities from the entity space into the corresponding relation space and then building translations between the projected entities. In experiments, we evaluate our models on three tasks: link prediction, triple classification, and relational fact extraction. Experimental results show significant and consistent improvements over state-of-the-art baselines, including TransE and TransH.
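As a rough illustration of the projection-then-translation idea described in the abstract (a minimal sketch, not the authors' released code), the TransR score of a triple (h, r, t) can be computed by projecting the head and tail entity vectors into the relation-specific space with a projection matrix and measuring the translation error there. The function name, embedding dimensions, and random inputs below are illustrative assumptions.

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """Dissimilarity of a triple (h, r, t) in the TransR style.

    h, t : entity embeddings in the entity space (dimension k)
    r    : relation embedding in the relation space (dimension d)
    M_r  : k x d projection matrix associated with relation r
    Lower scores indicate more plausible triples.
    """
    h_r = h @ M_r  # project head entity into the relation space
    t_r = t @ M_r  # project tail entity into the relation space
    return np.sum((h_r + r - t_r) ** 2)  # squared L2 translation error

# Illustrative usage with random vectors (k = 50 entity dims, d = 30 relation dims).
k, d = 50, 30
rng = np.random.default_rng(0)
h, t = rng.normal(size=k), rng.normal(size=k)
r, M_r = rng.normal(size=d), rng.normal(size=(k, d))
print(transr_score(h, t, r, M_r))
```

In training, such a score would typically be driven low for observed triples and high for corrupted ones via a margin-based objective; the sketch above only shows the scoring step.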

Published

2015-02-19

How to Cite

Lin, Y., Liu, Z., Sun, M., Liu, Y., & Zhu, X. (2015). Learning Entity and Relation Embeddings for Knowledge Graph Completion. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9491

Issue

Vol. 29 No. 1 (2015)

Section

Main Track: NLP and Knowledge Representation