TY - JOUR
AU - Gupta, Ankush
AU - Agarwal, Arvind
AU - Singh, Prawaan
AU - Rai, Piyush
PY - 2018/04/27
Y2 - 2024/03/28
TI - A Deep Generative Framework for Paraphrase Generation
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 32
IS - 1
SE - Main Track: NLP and Machine Learning
DO - 10.1609/aaai.v32i1.11956
UR - https://ojs.aaai.org/index.php/AAAI/article/view/11956
SP -
AB - Paraphrase generation is an important problem in NLP, with applications in question answering, information retrieval, information extraction, and conversation systems, to name a few. In this paper, we address the problem of generating paraphrases automatically. Our proposed method combines deep generative models (VAE) with sequence-to-sequence models (LSTM) to generate paraphrases, given an input sentence. Traditional VAEs, when combined with recurrent neural networks, can generate free text but are not suitable for generating paraphrases of a given sentence. We address this problem by conditioning both the encoder and decoder sides of the VAE on the original sentence, so that the model can generate that sentence's paraphrases. Unlike most existing models, our model is simple, modular, and can generate multiple paraphrases for a given sentence. Quantitative evaluation of the proposed method on a benchmark paraphrase dataset demonstrates its efficacy and its performance improvement over state-of-the-art methods by a significant margin, while qualitative human evaluation indicates that the generated paraphrases are well-formed, grammatically correct, and relevant to the input sentence. Furthermore, we evaluate our method on a newly released question paraphrase dataset, and establish a new baseline for future research.
ER -