5* Knowledge Graph Embeddings with Projective Transformations

Authors

  • Mojtaba Nayyeri — Smart Data Analytics Group, University of Bonn, Germany; Nature-Inspired Machine Intelligence-InfAI, Dresden, Germany
  • Sahar Vahdati — Nature-Inspired Machine Intelligence-InfAI, Dresden, Germany
  • Can Aykul — Smart Data Analytics Group, University of Bonn, Germany
  • Jens Lehmann — Smart Data Analytics Group, University of Bonn, Germany; Fraunhofer IAIS, Dresden, Germany

DOI:

https://doi.org/10.1609/aaai.v35i10.17095

Keywords:

Relational Learning, Representation Learning

Abstract

Performing link prediction with knowledge graph embedding models has become a popular approach to knowledge graph completion. Such models employ a transformation function that maps nodes via edges into a vector space in order to measure the likelihood of links. While mapping individual nodes, the structure of subgraphs is also transformed. Most embedding models designed in Euclidean geometry support only a single transformation type -- often translation or rotation -- which is suitable for learning on graphs with small differences between neighboring subgraphs. However, multi-relational knowledge graphs often contain multiple subgraph structures in a single neighborhood (e.g., combinations of path and loop structures), which current embedding models do not capture well. To tackle this problem, we propose 5*E, a novel KGE model based on projective geometry, which supports multiple simultaneous transformations -- specifically inversion, reflection, translation, rotation, and homothety. The model has several favorable theoretical properties and subsumes existing approaches. It outperforms them on the most widely used link prediction benchmarks.
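The core idea summarized in the abstract can be illustrated with a small sketch: a relation is modeled as an element-wise projective (Möbius) transformation t = (a·h + b) / (c·h + d) acting on a complex head embedding, and the special cases of this map recover the simpler transformations the abstract lists (translation, rotation, homothety, inversion, reflection). The function and parameter names below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def mobius_transform(h, a, b, c, d):
    """Apply an element-wise projective (Möbius) transformation
    t = (a*h + b) / (c*h + d) to a complex embedding vector h.

    Special cases recover single-transformation models:
      c=0, d=1, a=1        -> translation by b
      c=0, d=1, b=0, |a|=1 -> rotation by a
      c=0, d=1, b=0, a real-> homothety (scaling) by a
      a=0, b=1, c=1, d=0   -> inversion 1/h
    """
    return (a * h + b) / (c * h + d)

def score(h, t, a, b, c, d):
    """Plausibility score for a triple (h, r, t): negative distance
    between the transformed head and the tail embedding (a common
    distance-based KGE scoring choice, used here for illustration)."""
    return -np.linalg.norm(mobius_transform(h, a, b, c, d) - t)
```

Because translation, rotation, homothety, and inversion are all parameter settings of the same map, a single relation embedding (a, b, c, d) can blend them, which is how a projective model can fit mixed path and loop structures in one neighborhood.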

Published

2021-05-18

How to Cite

Nayyeri, M., Vahdati, S., Aykul, C., & Lehmann, J. (2021). 5* Knowledge Graph Embeddings with Projective Transformations. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 9064-9072. https://doi.org/10.1609/aaai.v35i10.17095

Section

AAAI Technical Track on Machine Learning III