A Knowledge-Grounded Neural Conversation Model

Authors

  • Marjan Ghazvininejad Information Sciences Institute, USC
  • Chris Brockett Microsoft
  • Ming-Wei Chang Microsoft
  • Bill Dolan Microsoft
  • Jianfeng Gao Microsoft
  • Wen-tau Yih Microsoft
  • Michel Galley Microsoft

DOI:

https://doi.org/10.1609/aaai.v32i1.11977

Keywords:

NLP, dialogue, conversation models, generation, deep learning

Abstract

Neural network models are capable of generating extremely natural-sounding conversational interactions. However, these models have been mostly applied to casual scenarios (e.g., as “chatbots”) and have yet to demonstrate they can serve in more useful conversational applications. This paper presents a novel, fully data-driven, and knowledge-grounded neural conversation model aimed at producing more contentful responses. We generalize the widely-used Sequence-to-Sequence (Seq2Seq) approach by conditioning responses on both conversation history and external “facts”, allowing the model to be versatile and applicable in an open-domain setting. Our approach yields significant improvements over a competitive Seq2Seq baseline. Human judges found that our outputs are significantly more informative.
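The abstract's core idea (conditioning a response decoder on both the conversation history and a set of retrieved external facts) can be illustrated with a minimal sketch. All names, dimensions, and the bag-of-words encoding below are illustrative assumptions, not the paper's actual architecture: fact encodings are pooled by attention keyed on the history encoding, and the pooled fact summary is combined with the history vector to seed the decoder.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding size (illustrative)

def encode(token_ids, emb):
    # Toy bag-of-words "encoder": mean of token embeddings.
    return emb[token_ids].mean(axis=0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

emb = rng.normal(size=(100, d))        # toy token embedding table
history = np.array([3, 17, 42])        # token ids of conversation history
facts = [[5, 9], [11, 12, 13], [7]]    # token ids of retrieved "facts"

h = encode(history, emb)                                  # history encoding
F = np.stack([encode(np.array(f), emb) for f in facts])   # one row per fact

# Attention over facts, keyed by the history encoding,
# pools them into a single fact summary vector.
weights = softmax(F @ h)
fact_summary = weights @ F

# The response decoder is conditioned on history AND facts:
# here, simply by summing the two vectors as its initial state.
decoder_init = h + fact_summary
```

A real implementation would use learned RNN/Transformer encoders and train end-to-end; this sketch only shows the conditioning structure the abstract describes.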

Published

2018-04-27

How to Cite

Ghazvininejad, M., Brockett, C., Chang, M.-W., Dolan, B., Gao, J., Yih, W.-t., & Galley, M. (2018). A Knowledge-Grounded Neural Conversation Model. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11977