KPT: Keyword-Guided Pre-training for Grounded Dialog Generation

Authors

  • Qi Zhu Tsinghua University
  • Fei Mi Huawei Noah’s Ark Lab
  • Zheng Zhang Tsinghua University
  • Yasheng Wang Huawei Noah’s Ark Lab
  • Yitong Li Huawei Technologies Co., Ltd.
  • Xin Jiang Huawei Noah’s Ark Lab
  • Qun Liu Huawei Noah’s Ark Lab
  • Xiaoyan Zhu Tsinghua University
  • Minlie Huang Tsinghua University

DOI:

https://doi.org/10.1609/aaai.v37i11.26646

Keywords:

SNLP: Conversational AI/Dialogue Systems, SNLP: Generation

Abstract

Incorporating external knowledge into the response generation process is essential to building more helpful and reliable dialog agents. However, collecting knowledge-grounded conversations is often costly, calling for a better pre-trained model for grounded dialog generation that generalizes well across different types of knowledge. In this work, we propose KPT (Keyword-guided Pre-Training), a novel self-supervised pre-training method for grounded dialog generation that does not rely on extra knowledge annotation. Specifically, we use a pre-trained language model to extract the most uncertain tokens in the dialog as keywords. With these keywords, we construct two kinds of knowledge and pre-train a knowledge-grounded response generation model, aiming to handle two different scenarios: (1) the knowledge should be faithfully grounded; (2) it can be selectively used. For the former, the grounding knowledge consists of keywords extracted from the response. For the latter, the grounding knowledge is additionally augmented with keywords extracted from other utterances in the same dialog. Since the knowledge is extracted from the dialog itself, KPT can be easily performed on a large volume and variety of dialog data. We consider three data sources (open-domain, task-oriented, conversational QA) with a total of 2.5M dialogs. We conduct extensive experiments on various few-shot knowledge-grounded generation tasks, including grounding on dialog acts, knowledge graphs, persona descriptions, and Wikipedia passages. Our comprehensive experiments and analyses demonstrate that KPT consistently outperforms state-of-the-art methods on these tasks with diverse grounding knowledge.
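The keyword-extraction idea described in the abstract (treating a token's uncertainty under a language model as a signal that it is a content-bearing keyword) can be sketched as follows. This is a toy illustration, not the paper's implementation: a unigram surprisal score stands in for the pre-trained language model's token uncertainty, and the function name `extract_keywords` is hypothetical.

```python
import math
from collections import Counter

def extract_keywords(utterances, corpus, top_k=3):
    """Rank tokens in a dialog by surprisal and return the top_k
    most uncertain ones as grounding keywords.

    NOTE: uses add-one-smoothed unigram surprisal as a stand-in for
    the pre-trained LM uncertainty used in the paper.
    """
    # Unigram counts over a background corpus of utterances.
    counts = Counter(tok for utt in corpus for tok in utt.split())
    total = sum(counts.values())
    vocab = len(counts)

    tokens = {tok for utt in utterances for tok in utt.split()}
    # Higher surprisal = rarer token = more likely a content keyword.
    surprisal = {t: -math.log((counts[t] + 1) / (total + vocab))
                 for t in tokens}
    return sorted(tokens, key=lambda t: surprisal[t], reverse=True)[:top_k]
```

Common function words ("like", "i") receive low surprisal and are filtered out, while rarer content tokens survive as keywords; in the paper these surviving keywords then serve as the constructed grounding knowledge for pre-training.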

Published

2023-06-26

How to Cite

Zhu, Q., Mi, F., Zhang, Z., Wang, Y., Li, Y., Jiang, X., Liu, Q., Zhu, X., & Huang, M. (2023). KPT: Keyword-Guided Pre-training for Grounded Dialog Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 14065-14073. https://doi.org/10.1609/aaai.v37i11.26646

Section

AAAI Technical Track on Speech & Natural Language Processing