Integrating Large Language Models and Möbius Group Transformations for Temporal Knowledge Graph Embedding on the Riemann Sphere

Authors

  • Sensen Zhang School of Information, Renmin University of China
  • Xun Liang School of Information, Renmin University of China
  • Simin Niu School of Information, Renmin University of China
  • Zhendong Niu Beijing Institute of Technology
  • Bo Wu Xiangjiang Laboratory, Central South University
  • Gengxin Hua Beijing Institute of Control Engineering
  • Long Wang Western Xia Research Institute, Ningxia University
  • Zhenyu Guan School of Information, Renmin University of China
  • Hanyu Wang School of Information, Renmin University of China
  • Xuan Zhang Harvest Fund Management Co., Ltd.
  • Zhiyu Li Shanghai Algorithm Innovation Research Institute
  • Yuefeng Ma School of Computer, Qufu Normal University

DOI:

https://doi.org/10.1609/aaai.v39i12.33449

Abstract

The significance of Temporal Knowledge Graphs (TKGs) in Artificial Intelligence (AI) lies in their capacity to incorporate time-dimensional information, support complex reasoning and prediction, optimize decision-making processes, enhance the accuracy of recommendation systems, promote multimodal data integration, and strengthen knowledge management and updates. This provides a robust foundation for various AI applications. To effectively learn and apply both static and dynamic temporal patterns for reasoning, a range of embedding methods and large language models (LLMs) have been proposed in the literature. However, these methods often rely on a single underlying embedding space, whose geometric properties severely limit their ability to model intricate temporal patterns, such as hierarchical and ring structures. To address this limitation, this paper proposes embedding TKGs into projective geometric space and leverages LLM technology to extract crucial temporal node information, thereby constructing the 5EL model. By embedding TKGs into projective geometric space and utilizing Möbius group transformations, we effectively model various temporal patterns. Subsequently, LLM technology is employed to process the trained TKGs. We adopt a parameter-efficient fine-tuning strategy to align LLMs with specific task requirements, thereby enhancing the model's ability to recognize structural information of key nodes in historical chains and enriching the representation of central entities. Experimental results on five advanced TKG datasets demonstrate that our proposed 5EL model significantly outperforms existing models.
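The abstract's core geometric ingredient is the Möbius transformation f(z) = (az + b)/(cz + d) with ad − bc ≠ 0, acting on the extended complex plane (equivalently, the Riemann sphere via inverse stereographic projection). The sketch below is purely illustrative of that mathematical machinery, not of the 5EL model itself; the function and variable names are the author's own. It shows two properties the paper's construction relies on: composition of Möbius maps corresponds to multiplication of their 2×2 coefficient matrices, and any point of the plane lifts to a point on the unit sphere.

```python
import numpy as np

def mobius(M, z):
    """Apply the Möbius transformation encoded by the 2x2 complex
    matrix M = [[a, b], [c, d]] (with ad - bc != 0) to a point z."""
    (a, b), (c, d) = M
    denom = c * z + d
    if denom == 0:
        return complex("inf")  # z maps to the point at infinity
    return (a * z + b) / denom

def to_sphere(z):
    """Inverse stereographic projection of z onto the unit Riemann sphere."""
    x, y = z.real, z.imag
    d = 1 + x * x + y * y
    return np.array([2 * x / d, 2 * y / d, (x * x + y * y - 1) / d])

# Two example transformations as coefficient matrices:
M1 = np.array([[1, 1j], [0, 1]], dtype=complex)  # translation z -> z + i
M2 = np.array([[0, 1], [-1, 0]], dtype=complex)  # inversion  z -> -1/z

z = 2 + 1j
# Group structure: composing the maps equals multiplying the matrices.
composed = mobius(M1 @ M2, z)
chained = mobius(M1, mobius(M2, z))
print(composed, chained)            # the two values coincide

# Lifting to the sphere: the image always has unit norm.
print(np.linalg.norm(to_sphere(z)))  # 1.0
```

Because Möbius maps form a group under composition (PSL(2, ℂ)), chains of temporal relations can be represented as products of such matrices, which is the kind of pattern composition the abstract alludes to.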

Published

2025-04-11

How to Cite

Zhang, S., Liang, X., Niu, S., Niu, Z., Wu, B., Hua, G., … Ma, Y. (2025). Integrating Large Language Models and Möbius Group Transformations for Temporal Knowledge Graph Embedding on the Riemann Sphere. Proceedings of the AAAI Conference on Artificial Intelligence, 39(12), 13277–13285. https://doi.org/10.1609/aaai.v39i12.33449

Section

AAAI Technical Track on Data Mining & Knowledge Management II