SR-FoT: A Syllogistic-Reasoning Framework of Thought for Large Language Models Tackling Knowledge-based Reasoning Tasks

Authors

  • Wentao Wan, School of Computer Science and Engineering, Sun Yat-sen University
  • Zhuojie Yang, School of Computer Science and Engineering, Sun Yat-sen University
  • Yongcan Chen, South China Normal University
  • Chenglin Luo, School of Computer Science and Engineering, Sun Yat-sen University
  • Ruilin Wang, School of Computer Science and Engineering, Sun Yat-sen University
  • Kehao Cai, School of Computer Science and Engineering, Sun Yat-sen University
  • Nan Kang, School of Computer Science and Engineering, Sun Yat-sen University
  • Liang Lin, School of Computer Science and Engineering, Sun Yat-sen University
  • Keze Wang, School of Computer Science and Engineering, Sun Yat-sen University; Guangdong Key Laboratory of Big Data Analysis and Processing

DOI:

https://doi.org/10.1609/aaai.v39i14.33666

Abstract

Deductive reasoning is a crucial logical capability that helps us solve complex problems on the basis of existing knowledge. Even when augmented with Chain-of-Thought prompting, Large Language Models (LLMs) may fail to follow correct reasoning paths. How to enhance the deductive reasoning abilities of LLMs, and how to leverage their extensive built-in knowledge for various reasoning tasks, remain open questions. Attempting to mimic the human deductive reasoning paradigm, we propose a multi-stage Syllogistic-Reasoning Framework of Thought (SR-FoT) that enables LLMs to perform syllogistic deductive reasoning on complex knowledge-based reasoning tasks. SR-FoT begins by interpreting the question, then uses the interpretation together with the original question to propose a suitable major premise. It then generates and answers minor-premise questions in two stages to obtain the minor premises. Finally, it guides the LLM to combine the previously generated major and minor premises in a syllogistic deduction that yields the answer to the original question. Extensive and thorough experiments on knowledge-based reasoning tasks demonstrate the effectiveness and advantages of SR-FoT.
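The staged pipeline described in the abstract can be sketched as a chain of prompted LLM calls. This is only an illustrative outline under our own assumptions: the prompt wordings, the `llm` callable, and the function names are hypothetical placeholders, not the paper's actual prompts or implementation.

```python
# Hypothetical sketch of the SR-FoT pipeline: interpret the question,
# propose a major premise, derive a minor premise in two stages, then
# deduce the final answer syllogistically. `llm` is any callable that
# maps a prompt string to a completion string (e.g. an API wrapper).

def sr_fot(question: str, llm) -> str:
    # Stage 1: interpret the original question.
    interpretation = llm(
        f"Interpret the following question and clarify what it asks: {question}"
    )
    # Stage 2: propose a major premise (a general rule) from the
    # question and its interpretation.
    major = llm(
        f"Question: {question}\nInterpretation: {interpretation}\n"
        "State a suitable major premise (a general rule) for answering it."
    )
    # Stage 3a: generate a minor-premise question that links the specific
    # case in the original question to the major premise.
    minor_q = llm(
        f"Major premise: {major}\nQuestion: {question}\n"
        "Pose the minor-premise question that connects this case to the rule."
    )
    # Stage 3b: answer the minor-premise question to obtain the minor premise.
    minor = llm(f"Answer this question to form a minor premise: {minor_q}")
    # Stage 4: syllogistic deduction from the two premises.
    return llm(
        f"Major premise: {major}\nMinor premise: {minor}\n"
        f"By syllogistic deduction, answer the original question: {question}"
    )
```

With a real model behind `llm`, each stage's output is fed verbatim into the next prompt, so errors in an early premise propagate; the paper's two-stage minor-premise step is what grounds the specific case before deduction.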

Published

2025-04-11

How to Cite

Wan, W., Yang, Z., Chen, Y., Luo, C., Wang, R., Cai, K., … Wang, K. (2025). SR-FoT: A Syllogistic-Reasoning Framework of Thought for Large Language Models Tackling Knowledge-based Reasoning Tasks. Proceedings of the AAAI Conference on Artificial Intelligence, 39(14), 15186–15194. https://doi.org/10.1609/aaai.v39i14.33666

Section

AAAI Technical Track on Knowledge Representation and Reasoning