Sentence Ordering and Coherence Modeling using Recurrent Neural Networks

Authors

  • Lajanugen Logeswaran, University of Michigan
  • Honglak Lee, University of Michigan
  • Dragomir Radev, Yale University

DOI:

https://doi.org/10.1609/aaai.v32i1.11997

Keywords:

sentence ordering, coherence modeling, discourse coherence

Abstract

Modeling the structure of coherent texts is a key NLP problem. The task of coherently organizing a given set of sentences has been commonly used to build and evaluate models that understand such structure. We propose an end-to-end unsupervised deep learning approach based on the set-to-sequence framework to address this problem. Our model strongly outperforms prior methods in the order discrimination task and a novel task of ordering abstracts from scientific articles. Furthermore, our work shows that useful text representations can be obtained by learning to order sentences. Visualizing the learned sentence representations shows that the model captures high-level logical structure in paragraphs. Our representations perform comparably to state-of-the-art pre-training methods on sentence similarity and paraphrase detection tasks.
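The core idea of the set-to-sequence approach described above is that a decoder repeatedly "points" to the unordered sentence that best continues the text produced so far. The toy sketch below illustrates only that greedy pointer-decoding loop; the function `pointer_decode`, the 1-D "coherence score" encodings, and the comparison rule are illustrative assumptions, not the authors' learned model, which uses trained sentence encoders and attention.

```python
# Hypothetical sketch of pointer-style greedy decoding for sentence
# ordering. A real set-to-sequence model learns sentence encodings and
# attention scores; here each sentence gets a toy 1-D "coherence score"
# so that greedy pointing recovers the coherent (ascending) order.

def pointer_decode(encodings):
    """Greedily emit sentence indices: at each step, point to the
    unused sentence whose encoding best continues the current state."""
    remaining = set(range(len(encodings)))
    state = float("-inf")  # decoder state before any sentence is chosen
    order = []
    while remaining:
        # Prefer the smallest encoding still above the current state;
        # at the first step this is simply the smallest encoding.
        nxt = min(remaining,
                  key=lambda i: (encodings[i] <= state, encodings[i]))
        order.append(nxt)
        state = encodings[nxt]
        remaining.remove(nxt)
    return order

# A shuffled "paragraph": toy scores stand in for learned encodings.
shuffled = [2.0, 0.5, 3.5, 1.0]
print(pointer_decode(shuffled))  # -> [1, 3, 0, 2]
```

In the actual model the decision at each step comes from an attention distribution over the remaining sentence embeddings, and the whole pipeline is trained end-to-end; this sketch only shows the shape of the decoding loop.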

Published

2018-04-27

How to Cite

Logeswaran, L., Lee, H., & Radev, D. (2018). Sentence Ordering and Coherence Modeling using Recurrent Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11997