Paragraph-level Commonsense Transformers with Recurrent Memory

Authors

  • Saadia Gabriel, University of Washington; Allen Institute for Artificial Intelligence
  • Chandra Bhagavatula, Allen Institute for Artificial Intelligence
  • Vered Shwartz, University of Washington; Allen Institute for Artificial Intelligence
  • Ronan Le Bras, Allen Institute for Artificial Intelligence
  • Maxwell Forbes, University of Washington; Allen Institute for Artificial Intelligence
  • Yejin Choi, University of Washington; Allen Institute for Artificial Intelligence

DOI:

https://doi.org/10.1609/aaai.v35i14.17521

Keywords:

Generation, Common-Sense Reasoning, Social Cognition And Interaction, Discourse, Pragmatics & Argument Mining

Abstract

Human understanding of narrative texts requires making commonsense inferences beyond what is stated explicitly in the text. A recent model, COMET, can generate such inferences along several dimensions such as pre- and post-conditions, motivations, and mental states of the participants. However, COMET was trained on short phrases, and is therefore discourse-agnostic. When presented with each sentence of a multi-sentence narrative, it might generate inferences that are inconsistent with the rest of the narrative. We present the task of discourse-aware commonsense inference. Given a sentence within a narrative, the goal is to generate commonsense inferences along predefined dimensions, while maintaining coherence with the rest of the narrative. Collecting such paragraph-level annotations at scale is costly, so we use available sentence-level annotations to efficiently and automatically construct a distantly supervised corpus. Using this corpus, we train PARA-COMET, a discourse-aware model that incorporates paragraph-level information to generate coherent commonsense inferences from narratives. PARA-COMET captures both semantic knowledge pertaining to prior world knowledge, and episodic knowledge involving how current events relate to prior and future events in a narrative. Our results show that PARA-COMET outperforms the sentence-level baselines, particularly in generating inferences that are both coherent and novel.
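The recurrent-memory idea described in the abstract can be illustrated with a short sketch. The snippet below is an illustrative assumption, not the authors' released PARA-COMET code: `generate_inference` is a hypothetical stand-in for a COMET-style generator, and the memory simply carries earlier inferences forward so that inferences for later sentences can remain consistent with the narrative so far.

```python
# Minimal sketch (assumption, not the authors' implementation) of
# discourse-aware inference generation with a recurrent memory.

from typing import List


def generate_inference(sentence: str, dimension: str, memory: List[str]) -> str:
    """Hypothetical stand-in for a COMET-style generator: conditions on the
    current sentence, a relation dimension (e.g. xIntent, xEffect), and the
    memory of inferences made for earlier sentences."""
    context = " ".join(memory[-3:])  # keep only the most recent inferences
    return f"[{dimension}] inference for '{sentence}' given context '{context}'"


def paragraph_inferences(narrative: List[str], dimensions: List[str]) -> List[str]:
    """Generate per-sentence inferences while carrying forward a recurrent
    memory, so later inferences can stay coherent with earlier ones."""
    memory: List[str] = []
    outputs: List[str] = []
    for sentence in narrative:
        for dim in dimensions:
            inf = generate_inference(sentence, dim, memory)
            outputs.append(inf)
            memory.append(inf)  # feed the new inference back into the memory
    return outputs


if __name__ == "__main__":
    story = [
        "Ana packed her bags.",
        "She drove to the airport.",
        "Her flight was delayed.",
    ]
    for line in paragraph_inferences(story, ["xIntent", "xEffect"]):
        print(line)
```

The key design point the sketch tries to convey is that each sentence is not processed in isolation: the generator always sees a compact summary of earlier inferences, which is what distinguishes a discourse-aware model from a purely sentence-level one.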

Published

2021-05-18

How to Cite

Gabriel, S., Bhagavatula, C., Shwartz, V., Le Bras, R., Forbes, M., & Choi, Y. (2021). Paragraph-level Commonsense Transformers with Recurrent Memory. Proceedings of the AAAI Conference on Artificial Intelligence, 35(14), 12857-12865. https://doi.org/10.1609/aaai.v35i14.17521

Section

AAAI Technical Track on Speech and Natural Language Processing I