Analogical Chaining with Natural Language Instruction for Commonsense Reasoning

Authors

  • Joseph Blass, Northwestern University
  • Kenneth Forbus, Northwestern University

DOI:

https://doi.org/10.1609/aaai.v31i1.11153

Keywords:

Analogical Reasoning, Commonsense Reasoning, Natural Language Understanding

Abstract

Understanding commonsense reasoning is one of the core challenges of AI. We are exploring an approach inspired by cognitive science, called analogical chaining, to create cognitive systems that can perform commonsense reasoning. Just as rules are chained in deductive systems, multiple analogies build upon each other’s inferences in analogical chaining. The cases used in analogical chaining – called common sense units – are small, to provide inferential focus and broader transfer. Importantly, such common sense units can be learned via natural language instruction, thereby increasing the ease of extending such systems. This paper describes analogical chaining, natural language instruction via microstories, and some subtleties that arise in controlling reasoning. The utility of this technique is demonstrated by performance of an implemented system on problems from the Choice of Plausible Alternatives test of commonsense causal reasoning.
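To make the chaining idea from the abstract concrete, below is a minimal, hypothetical sketch (in Python) of the control loop it describes: each retrieved small case (common sense unit) contributes candidate inferences to working memory, which can in turn trigger further retrievals. The names `retrieve` and `map_and_infer` are placeholder parameters, not the paper's actual components, and the loop structure is an assumption based only on the abstract's description.

```python
# Illustrative sketch only. The retrieval and mapping functions are hypothetical
# stand-ins for whatever analogical machinery the implemented system uses.

def analogical_chaining(problem_facts, case_library, retrieve, map_and_infer,
                        max_steps=5):
    """Chain analogies: each small retrieved case yields candidate inferences
    that are added to working memory, enabling further analogies to build on them."""
    working_memory = set(problem_facts)
    for _ in range(max_steps):
        # Find a relevant common sense unit given the current working memory.
        case = retrieve(working_memory, case_library)
        if case is None:
            break
        # Analogical mapping between the case and working memory yields
        # candidate inferences about the current problem.
        inferences = set(map_and_infer(case, working_memory)) - working_memory
        if not inferences:
            break  # nothing new was inferred; chaining has converged
        # New inferences become available as input to the next analogy.
        working_memory |= inferences
    return working_memory
```

The loop bound and convergence test stand in for the reasoning-control subtleties the abstract alludes to; the paper itself should be consulted for how the implemented system actually manages them.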

Published

2017-02-12

How to Cite

Blass, J., & Forbus, K. (2017). Analogical Chaining with Natural Language Instruction for Commonsense Reasoning. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.11153