Sequential Copying Networks

Authors

  • Qingyu Zhou, Harbin Institute of Technology
  • Nan Yang, Microsoft Research
  • Furu Wei, Microsoft Research
  • Ming Zhou, Microsoft Research

DOI:

https://doi.org/10.1609/aaai.v32i1.11915

Keywords:

Summarization, Question Generation

Abstract

The copying mechanism has proven effective in sequence-to-sequence neural network models for text generation tasks such as abstractive sentence summarization and question generation. However, existing work on modeling the copying or pointing mechanism only considers copying single words from the source sentence. In this paper, we propose a novel copying framework, named Sequential Copying Networks (SeqCopyNet), which learns to copy not only single words but also sequences from the input sentence. It leverages pointer networks to explicitly select a sub-span on the source side and copy it to the target side, and integrates this sequential copying mechanism into the generation process of the encoder-decoder paradigm. Experiments on abstractive sentence summarization and question generation tasks show that the proposed SeqCopyNet can copy meaningful spans and outperforms the baseline models.
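The abstract describes two coupled decisions at each decoding step: whether to generate a word or to copy a contiguous span, and, when copying, where the span starts and ends in the source. The sketch below illustrates one way such a step could be wired up in PyTorch; the module names, the bilinear pointer scoring, and the soft start-conditioned end pointer are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of one sequential-copy decoding step (illustrative, not the
# authors' released implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SequentialCopyStep(nn.Module):
    def __init__(self, hidden_size, vocab_size):
        super().__init__()
        self.copy_gate = nn.Linear(hidden_size, 1)                  # copy vs. generate switch
        self.gen_proj = nn.Linear(hidden_size, vocab_size)          # word-generation logits
        self.start_ptr = nn.Bilinear(hidden_size, hidden_size, 1)   # span-start pointer scores
        self.end_ptr = nn.Bilinear(hidden_size, hidden_size, 1)     # span-end pointer scores

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, hidden); enc_outputs: (batch, src_len, hidden)
        batch, src_len, _ = enc_outputs.size()
        p_copy = torch.sigmoid(self.copy_gate(dec_state))           # probability of copying a span

        # Pointer distribution over source positions for the span start.
        dec_exp = dec_state.unsqueeze(1).expand(-1, src_len, -1).contiguous()
        start_logits = self.start_ptr(dec_exp, enc_outputs).squeeze(-1)
        start_dist = F.softmax(start_logits, dim=-1)

        # Condition the end pointer on a soft summary of the chosen start position.
        start_repr = torch.bmm(start_dist.unsqueeze(1), enc_outputs).squeeze(1)
        cond = (dec_state + start_repr).unsqueeze(1).expand(-1, src_len, -1).contiguous()
        end_logits = self.end_ptr(cond, enc_outputs).squeeze(-1)
        end_dist = F.softmax(end_logits, dim=-1)

        # Ordinary generation distribution over the vocabulary.
        gen_dist = F.softmax(self.gen_proj(dec_state), dim=-1)
        return p_copy, start_dist, end_dist, gen_dist
```

At decode time, the gate would arbitrate between emitting the most likely vocabulary word and copying the source span delimited by the start and end pointers; during training, the gate, the two pointer distributions, and the generation distribution can be supervised jointly.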

Published

2018-04-26

How to Cite

Zhou, Q., Yang, N., Wei, F., & Zhou, M. (2018). Sequential Copying Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11915

Issue

Vol. 32 No. 1 (2018)

Section

Main Track: NLP and Knowledge Representation