Inductive Relation Prediction by BERT


  • Hanwen Zha University of California, Santa Barbara
  • Zhiyu Chen University of California, Santa Barbara
  • Xifeng Yan University of California, Santa Barbara



Knowledge Representation and Reasoning (KRR), Speech & Natural Language Processing (SNLP)


Relation prediction in knowledge graphs is dominated by embedding-based methods, which mainly focus on the transductive setting. Unfortunately, they cannot handle the inductive setting, where unseen entities and relations are present, and they cannot take advantage of prior knowledge. Furthermore, their inference process is not easily explainable. In this work, we propose an all-in-one solution, called BERTRL (BERT-based Relational Learning), which leverages a pre-trained language model and fine-tunes it by taking relation instances and their possible reasoning paths as training samples. BERTRL outperforms the state of the art in 15 out of 18 cases in both inductive and transductive settings. Meanwhile, it demonstrates strong generalization capability in few-shot learning and is explainable. The data and code can be found at
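As a rough illustration of the idea the abstract describes, a target relation instance and a candidate reasoning path can be linearized into a sentence pair and scored by a BERT-style sequence-pair classifier. The sketch below is hypothetical (the `linearize` helper and the exact verbalization format are assumptions, not the paper's actual implementation) and only shows the text construction step:

```python
# Hypothetical sketch: turn a target triple and one reasoning path
# into a (target, context) sentence pair for BERT-style fine-tuning.
# The verbalization format is an assumption for illustration only.
def linearize(triple, path):
    """Verbalize (head, relation, tail) and a path of support triples."""
    h, r, t = triple
    target = f"{h} {r.replace('_', ' ')} {t}"
    context = "; ".join(
        f"{a} {b.replace('_', ' ')} {c}" for a, b, c in path
    )
    return target, context

target, context = linearize(
    ("Paris", "capital_of", "France"),
    [("Paris", "located_in", "Ile-de-France"),
     ("Ile-de-France", "region_of", "France")],
)
print(target)   # Paris capital of France
print(context)  # Paris located in Ile-de-France; Ile-de-France region of France
```

In a full pipeline, such pairs would be fed to a fine-tuned classifier that scores whether the path supports the target relation; paths themselves double as human-readable explanations.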




How to Cite

Zha, H., Chen, Z., & Yan, X. (2022). Inductive Relation Prediction by BERT. Proceedings of the AAAI Conference on Artificial Intelligence, 36(5), 5923-5931.



AAAI Technical Track on Knowledge Representation and Reasoning