Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning

Authors

  • Tianle Xia, Wuhan University
  • Liang Ding, The University of Sydney
  • Guojia Wan, Wuhan University
  • Yibing Zhan, JD Explore Academy
  • Bo Du, Wuhan University
  • Dacheng Tao, Nanyang Technological University

DOI:

https://doi.org/10.1609/aaai.v39i12.33405

Abstract

Answering complex queries over incomplete knowledge graphs (KGs) is a challenging task. Most previous works have focused on learning entity/relation embeddings and simulating first-order logic operators with various neural networks. However, they are bottlenecked by their inability to share world knowledge to improve logical reasoning, resulting in suboptimal performance. In this paper, we propose a complex reasoning schema over KGs built upon large language models (LLMs), comprising a curriculum-based logic-aware instruction tuning framework, named LACT. Specifically, we augment arbitrary first-order logic queries via binary tree decomposition to stimulate the reasoning capability of LLMs. To address the difficulty gap among different types of complex queries, we design a simple and flexible logic-aware curriculum learning framework. Experiments across widely used datasets demonstrate that LACT yields substantial improvements (an average +5.5% MRR gain) over advanced methods, achieving a new state of the art.
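The binary tree decomposition mentioned in the abstract can be illustrated with a minimal sketch: a complex first-order logic query is represented as a binary tree of logical operators over atomic relations, and a post-order traversal turns it into an ordered sequence of simple reasoning steps. The `Node` class, operator names, and relation labels below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of binary-tree decomposition of a first-order logic (FOL)
# query; node/operator/relation names are assumptions for illustration only.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    """A query-tree node: a logical operator or an atomic relation (leaf)."""
    op: str                        # e.g. 'projection', 'intersection', or a relation
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def decompose(node: Optional[Node]) -> List[str]:
    """Post-order traversal: emit each sub-query after its inputs, so a complex
    query unfolds into a sequence of simple steps an LLM can follow."""
    if node is None:
        return []
    steps = decompose(node.left) + decompose(node.right)
    steps.append(node.op)
    return steps


# Example: an 'ip'-style query, (A ∩ B) followed by a relation projection.
query = Node("projection",
             left=Node("intersection",
                       left=Node("rel:born_in"),
                       right=Node("rel:works_at")),
             right=Node("rel:located_in"))

print(decompose(query))
# ['rel:born_in', 'rel:works_at', 'intersection', 'rel:located_in', 'projection']
```

Each emitted step depends only on steps before it, which is what allows a complex query to be serialized into a chain of elementary reasoning instructions.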

Published

2025-04-11

How to Cite

Xia, T., Ding, L., Wan, G., Zhan, Y., Du, B., & Tao, D. (2025). Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning. Proceedings of the AAAI Conference on Artificial Intelligence, 39(12), 12881–12889. https://doi.org/10.1609/aaai.v39i12.33405

Section

AAAI Technical Track on Data Mining & Knowledge Management II