Dynamic Neuro-Symbolic Knowledge Graph Construction for Zero-shot Commonsense Question Answering

Authors

  • Antoine Bosselut, Stanford University; Allen Institute for AI
  • Ronan Le Bras, Allen Institute for AI
  • Yejin Choi, University of Washington; Allen Institute for AI

Keywords:

Neuro-Symbolic AI (NSAI)

Abstract

Understanding narratives requires reasoning about implicit world knowledge related to the causes, effects, and states of situations described in text. At the core of this challenge is how to access contextually relevant knowledge on demand and reason over it. In this paper, we present initial studies toward zero-shot commonsense question answering by formulating the task as inference over dynamically generated commonsense knowledge graphs. In contrast to previous studies for knowledge integration that rely on retrieval of existing knowledge from static knowledge graphs, our study requires commonsense knowledge integration where contextually relevant knowledge is often not present in existing knowledge bases. Therefore, we present a novel approach that generates contextually-relevant symbolic knowledge structures on demand using generative neural commonsense knowledge models. Empirical results on two datasets demonstrate the efficacy of our neuro-symbolic approach for dynamically constructing knowledge graphs for reasoning. Our approach achieves significant performance boosts over pretrained language models and vanilla knowledge models, all while providing interpretable reasoning paths for its predictions.
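The approach the abstract describes can be outlined as a loop: query a generative commonsense knowledge model with the current context, add the generated inferences as graph edges, and reason over the resulting graph to score candidate answers. The sketch below is a minimal, runnable illustration of that control flow only; `toy_knowledge_model`, its dictionary contents, and the overlap-based scorer are hypothetical stand-ins (the paper uses a neural knowledge model and learned inference scores, not shown here).

```python
# Sketch of dynamic knowledge-graph construction for zero-shot commonsense QA.
# `toy_knowledge_model` is a hypothetical dict-backed stand-in for a generative
# neural knowledge model; the real system generates inferences on demand.

def toy_knowledge_model(context):
    """Stand-in: map a context to (relation, inference) pairs."""
    inferences = {
        "X repaid the loan": [
            ("xIntent", "to be debt-free"),
            ("xEffect", "feels relieved"),
        ],
    }
    return inferences.get(context, [])

def build_graph(context, depth=2):
    """Expand the context into a graph by querying the model on demand."""
    graph = {}  # node -> list of (relation, node) edges
    frontier = [context]
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            if node in graph:
                continue
            edges = toy_knowledge_model(node)
            graph[node] = edges
            next_frontier.extend(tail for _, tail in edges)
        frontier = next_frontier
    return graph

def score_answer(graph, answer):
    """Toy scorer: word overlap between generated nodes and the answer."""
    words = {w for edges in graph.values() for _, tail in edges
             for w in tail.split()}
    return sum(1 for w in answer.split() if w in words)

graph = build_graph("X repaid the loan")
answers = ["feels relieved", "feels angry"]
best = max(answers, key=lambda a: score_answer(graph, a))
```

Because every answer is scored against explicitly generated graph nodes, the supporting inferences double as an interpretable reasoning path for the prediction, which is the property the abstract highlights.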

Published

2021-05-18

How to Cite

Bosselut, A., Le Bras, R., & Choi, Y. (2021). Dynamic Neuro-Symbolic Knowledge Graph Construction for Zero-shot Commonsense Question Answering. Proceedings of the AAAI Conference on Artificial Intelligence, 35(6), 4923-4931. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16625

Section

AAAI Technical Track Focus Area on Neuro-Symbolic AI