Neural Machine Translation with Reconstruction

Authors

  • Zhaopeng Tu Noah's Ark Lab, Huawei Technologies
  • Yang Liu Tsinghua University
  • Lifeng Shang Noah's Ark Lab, Huawei Technologies
  • Xiaohua Liu Noah's Ark Lab, Huawei Technologies
  • Hang Li Noah's Ark Lab, Huawei Technologies

DOI:

https://doi.org/10.1609/aaai.v31i1.10950

Keywords:

neural machine translation, reconstruction, adequacy

Abstract

Although end-to-end Neural Machine Translation (NMT) has achieved remarkable progress in the past two years, it suffers from a major drawback: translations generated by NMT systems often lack adequacy. It has been widely observed that NMT tends to repeatedly translate some source words while mistakenly ignoring other words. To alleviate this problem, we propose a novel encoder-decoder-reconstructor framework for NMT. The reconstructor, incorporated into the NMT model, reconstructs the input source sentence from the hidden layer of the output target sentence, to ensure that the information on the source side is transferred to the target side as much as possible. Experiments show that the proposed framework significantly improves the adequacy of NMT output and achieves superior translation results over state-of-the-art NMT and statistical MT systems.
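The framework described above trains the model to maximize both the translation likelihood and the likelihood of reconstructing the source sentence from the decoder's hidden states. A minimal sketch of such a combined training objective is below; the function name, the toy log-likelihood values, and the interpolation form are illustrative assumptions, not the paper's exact implementation (in practice both terms are computed by neural networks over full sentences).

```python
def reconstruction_objective(log_p_translation, log_p_reconstruction, lam=1.0):
    """Hypothetical combined objective for an encoder-decoder-reconstructor:
    the translation log-likelihood log P(y|x) plus a lambda-weighted
    reconstruction log-likelihood log P(x|s), where s denotes the
    decoder hidden states produced while generating y."""
    return log_p_translation + lam * log_p_reconstruction

# Toy per-sentence log-likelihoods (illustrative numbers only):
# maximizing this objective rewards translations whose hidden states
# retain enough source information to rebuild the input sentence.
obj = reconstruction_objective(-2.5, -3.0, lam=1.0)
print(obj)  # -5.5
```

At decoding time the same reconstruction score can also be used to rerank candidate translations, favoring hypotheses that cover the source more fully.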

Published

2017-02-12

How to Cite

Tu, Z., Liu, Y., Shang, L., Liu, X., & Li, H. (2017). Neural Machine Translation with Reconstruction. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10950

Section

Main Track: NLP and Knowledge Representation