Variational Reasoning for Question Answering With Knowledge Graph

Authors

  • Yuyu Zhang, Georgia Institute of Technology
  • Hanjun Dai, Georgia Institute of Technology
  • Zornitsa Kozareva, Amazon Web Services
  • Alexander Smola, Amazon Web Services
  • Le Song, Georgia Institute of Technology

DOI:

https://doi.org/10.1609/aaai.v32i1.12057

Keywords:

Question Answering

Abstract

A knowledge graph (KG) is known to be helpful for the task of question answering (QA), since it provides well-structured relational information between entities and allows one to further infer indirect facts. However, it is challenging to build QA systems that can learn to reason over knowledge graphs based on question-answer pairs alone. First, when people ask questions, their expressions are noisy (for example, typos in text, or variations in pronunciation), which makes it non-trivial for the QA system to match the mentioned entities to the knowledge graph. Second, many questions require multi-hop logical reasoning over the knowledge graph to retrieve the answers. To address these challenges, we propose a novel and unified deep learning architecture, together with an end-to-end variational learning algorithm that can handle noise in questions and learn multi-hop reasoning simultaneously. Our method achieves state-of-the-art performance on a recent benchmark dataset in the literature. We also derive a series of new benchmark datasets, including questions requiring multi-hop reasoning, questions paraphrased by a neural translation model, and questions in human voice. Our method yields very promising results on all these challenging datasets.
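
To make "end-to-end variational learning" concrete, the sketch below shows the generic form of variational lower bound (ELBO) that latent-variable QA models of this kind typically optimize. The notation (question q, answer a, latent topic entity y, variational posterior Q_theta) is our illustrative choice and is not taken verbatim from the abstract.

% Generic ELBO for QA when the topic entity y mentioned in question q is treated
% as a latent variable (its noisy surface form prevents exact matching to the KG):
\begin{align*}
\log p_\phi(a \mid q)
  &= \log \sum_{y} p_\phi(a \mid y, q)\, p(y \mid q) \\
  &\geq \mathbb{E}_{Q_\theta(y \mid q)}\!\big[\log p_\phi(a \mid y, q)\big]
     - \mathrm{KL}\!\big(Q_\theta(y \mid q) \,\|\, p(y \mid q)\big).
\end{align*}
% Here p_phi(a | y, q) scores candidate answers by (multi-hop) reasoning over the
% knowledge graph starting from y, and the recognition model Q_theta and the
% reasoning model p_phi are trained jointly from question-answer pairs alone.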

Published

2018-04-26

How to Cite

Zhang, Y., Dai, H., Kozareva, Z., Smola, A., & Song, L. (2018). Variational Reasoning for Question Answering With Knowledge Graph. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12057