Entailment Relation Aware Paraphrase Generation

Authors

  • Abhilasha Sancheti (University of Maryland, College Park; Adobe Research)
  • Balaji Vasan Srinivasan (Adobe Research)
  • Rachel Rudinger (University of Maryland, College Park)

DOI:

https://doi.org/10.1609/aaai.v36i10.21376

Keywords:

Speech & Natural Language Processing (SNLP)

Abstract

We introduce the new task of entailment relation aware paraphrase generation, which aims to generate a paraphrase that conforms to a given entailment relation (e.g., equivalent, forward entailing, or reverse entailing) with respect to a given input. We propose a reinforcement learning-based, weakly-supervised paraphrasing system, ERAP, that can be trained using existing paraphrase and natural language inference (NLI) corpora, without an explicit task-specific corpus. A combination of automated and human evaluations shows that ERAP generates paraphrases that conform to the specified entailment relation and are of good quality compared to baselines and uncontrolled paraphrasing systems. Using ERAP to augment training data for a downstream textual entailment task improves performance over an uncontrolled paraphrasing system and introduces fewer training artifacts, indicating the benefit of explicit control during paraphrasing.
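The three entailment relations named in the abstract can be illustrated by running an off-the-shelf NLI model in both directions between a source sentence and a candidate paraphrase. The sketch below is only an illustration of the task definition, not the ERAP system or its reward design; it assumes the publicly available roberta-large-mnli checkpoint and the Hugging Face transformers text-classification pipeline.

from transformers import pipeline

# Off-the-shelf NLI classifier; labels are CONTRADICTION / NEUTRAL / ENTAILMENT.
nli = pipeline("text-classification", model="roberta-large-mnli")

def entails(premise: str, hypothesis: str) -> bool:
    """True if the NLI model predicts ENTAILMENT for premise -> hypothesis."""
    result = nli({"text": premise, "text_pair": hypothesis})
    return result["label"] == "ENTAILMENT"

def entailment_relation(source: str, paraphrase: str) -> str:
    """Classify the paraphrase's relation to the source (illustrative only)."""
    forward = entails(source, paraphrase)   # source entails paraphrase
    reverse = entails(paraphrase, source)   # paraphrase entails source
    if forward and reverse:
        return "equivalent"
    if forward:
        return "forward entailing"
    if reverse:
        return "reverse entailing"
    return "other"

print(entailment_relation("A man is eating a sandwich.",
                          "A man is eating food."))
# Expected: "forward entailing" (the paraphrase is more general than the source).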

Published

2022-06-28

How to Cite

Sancheti, A., Srinivasan, B. V., & Rudinger, R. (2022). Entailment Relation Aware Paraphrase Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(10), 11258-11266. https://doi.org/10.1609/aaai.v36i10.21376

Section

AAAI Technical Track on Speech and Natural Language Processing