Inductive Relation Prediction by BERT

Authors

  • Hanwen Zha University of California, Santa Barbara
  • Zhiyu Chen University of California, Santa Barbara
  • Xifeng Yan University of California, Santa Barbara

DOI:

https://doi.org/10.1609/aaai.v36i5.20537

Keywords:

Knowledge Representation And Reasoning (KRR), Speech & Natural Language Processing (SNLP)

Abstract

Relation prediction in knowledge graphs is dominated by embedding-based methods, which mainly focus on the transductive setting. Unfortunately, they are unable to handle inductive learning, where unseen entities and relations are present, and they cannot take advantage of prior knowledge. Furthermore, their inference process is not easily explainable. In this work, we propose an all-in-one solution, called BERTRL (BERT-based Relational Learning), which leverages a pre-trained language model and fine-tunes it by taking relation instances and their possible reasoning paths as training samples. BERTRL outperforms the state of the art in 15 out of 18 cases in both inductive and transductive settings. Meanwhile, it demonstrates strong generalization capability in few-shot learning and is explainable. The data and code can be found at https://github.com/zhw12/BERTRL.
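
To make the idea of "relation instances and their possible reasoning paths as training samples" concrete, below is a minimal, illustrative sketch of how such a pair could be verbalized and scored with a fine-tuned BERT sequence classifier. This is not the authors' released implementation (see the repository above); it assumes the Hugging Face Transformers library, a bert-base-uncased checkpoint, and made-up entity and relation names, and the verbalization format and positive-class index are likewise assumptions.

# Illustrative BERTRL-style scoring sketch (assumptions noted above; not the released code).
# A candidate triple (head, relation, tail) is paired with one verbalized reasoning path
# and scored by a BERT sequence classifier.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

def score_triple(head, relation, tail, path_triples):
    """Return a plausibility score for (head, relation, tail) given one reasoning path."""
    # Segment A: the target relation instance, verbalized as text.
    query = f"{head} {relation.replace('_', ' ')} {tail}"
    # Segment B: the supporting path, one verbalized triple per hop.
    path = " ; ".join(f"{h} {r.replace('_', ' ')} {t}" for h, r, t in path_triples)
    inputs = tokenizer(query, path, return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the "true triple" class (index 1 by convention in this sketch).
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Hypothetical example: does (Alice, works_in, Paris) follow from a two-hop path?
path = [("Alice", "works_for", "AcmeCorp"), ("AcmeCorp", "headquartered_in", "Paris")]
print(score_triple("Alice", "works_in", "Paris", path))

In this framing, fine-tuning reduces relation prediction to binary sequence-pair classification over verbalized triples and paths, which is what allows the model to handle entities unseen during training.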

Published

2022-06-28

How to Cite

Zha, H., Chen, Z., & Yan, X. (2022). Inductive Relation Prediction by BERT. Proceedings of the AAAI Conference on Artificial Intelligence, 36(5), 5923-5931. https://doi.org/10.1609/aaai.v36i5.20537

Section

AAAI Technical Track on Knowledge Representation and Reasoning