Relational Graph Neural Network with Hierarchical Attention for Knowledge Graph Completion


  • Zhao Zhang Chinese Academy of Sciences
  • Fuzhen Zhuang Chinese Academy of Sciences
  • Hengshu Zhu Baidu Inc.
  • Zhiping Shi Capital Normal University
  • Hui Xiong Baidu Inc.
  • Qing He Chinese Academy of Sciences



The rapid proliferation of knowledge graphs (KGs) has changed the paradigm for various AI-related applications. Despite their large sizes, modern KGs are far from complete and comprehensive. This has motivated research on knowledge graph completion (KGC), which aims to infer the missing elements of incomplete knowledge triples. However, most existing KGC models treat the triples in a KG independently, without leveraging the inherent and valuable information in the local neighborhood surrounding an entity. To this end, we propose a Relational Graph neural network with Hierarchical ATtention (RGHAT) for the KGC task. The proposed model is equipped with a two-level attention mechanism: (i) the first level is relation-level attention, which is inspired by the intuition that different relations have different weights for indicating an entity; (ii) the second level is entity-level attention, which enables our model to highlight the importance of different neighboring entities under the same relation. The hierarchical attention mechanism makes our model more effective at utilizing the neighborhood information of an entity. Finally, we extensively validate the superiority of RGHAT against various state-of-the-art baselines.
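To illustrate the two-level aggregation the abstract describes, here is a minimal NumPy sketch of hierarchical attention over an entity's neighborhood. All names (`aggregate_neighborhood`, the dot-product scoring, the embedding dimension) are illustrative assumptions, not the paper's actual formulation: RGHAT's scoring functions and parameterization differ, but the flow is the same, entity-level attention within each relation, then relation-level attention across relations.

```python
import numpy as np


def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()


def aggregate_neighborhood(entity, neighborhood):
    """Aggregate neighbor embeddings with two-level attention.

    entity       : (d,) embedding of the central entity.
    neighborhood : list of (relation_vec, [neighbor_vecs]) pairs,
                   one entry per relation incident to the entity.

    Dot-product scoring is an illustrative stand-in for the
    paper's learned attention functions.
    """
    # Relation-level attention: weight each relation by its
    # relevance to the central entity.
    rel_scores = np.array([entity @ r for r, _ in neighborhood])
    alpha = softmax(rel_scores)

    out = np.zeros_like(entity)
    for a, (r, ents) in zip(alpha, neighborhood):
        # Entity-level attention: weight neighbors under this relation.
        ent_scores = np.array([entity @ e for e in ents])
        beta = softmax(ent_scores)
        rel_message = sum(b * e for b, e in zip(beta, ents))
        out += a * rel_message
    return out
```

In a full GNN layer this aggregated vector would be combined with the entity's own representation and passed through a learned transformation and nonlinearity; the sketch only shows how the two attention levels compose.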




How to Cite

Zhang, Z., Zhuang, F., Zhu, H., Shi, Z., Xiong, H., & He, Q. (2020). Relational Graph Neural Network with Hierarchical Attention for Knowledge Graph Completion. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9612-9619.



AAAI Technical Track: Natural Language Processing