Lattice-Based Recurrent Neural Network Encoders for Neural Machine Translation

Authors

  • Jinsong Su, Xiamen University
  • Zhixing Tan, Xiamen University
  • Deyi Xiong, Soochow University
  • Rongrong Ji, Xiamen University
  • Xiaodong Shi, Xiamen University
  • Yang Liu, Tsinghua University

DOI:

https://doi.org/10.1609/aaai.v31i1.10968

Abstract

Neural machine translation (NMT) heavily relies on word-level modelling to learn semantic representations of input sentences. However, for languages without natural word delimiters (e.g., Chinese), where input sentences have to be tokenized first, conventional NMT is confronted with two issues: 1) it is difficult to find an optimal tokenization granularity for source sentence modelling, and 2) errors in 1-best tokenizations may propagate to the encoder of NMT. To handle these issues, we propose word-lattice based Recurrent Neural Network (RNN) encoders for NMT, which generalize the standard RNN to word lattice topology. The proposed encoders take as input a word lattice that compactly encodes multiple tokenizations, and learn to generate new hidden states from arbitrarily many inputs and hidden states in preceding time steps. As such, the word-lattice based encoders not only alleviate the negative impact of tokenization errors but are also more expressive and flexible in embedding input sentences. Experimental results on Chinese-English translation demonstrate the superiority of the proposed encoders over the conventional encoder.
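
The core idea in the abstract, computing a node's hidden state from arbitrarily many predecessor hidden states in a word lattice, can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the GRU-style cell, the element-wise max pooling over incoming edges, and the names (GRUCell, lattice_encode, embed) are not taken from the paper, whose actual formulation may combine predecessor states differently.

```
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Standard GRU cell: one input embedding, one previous hidden state."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        # Parameters for update gate (z), reset gate (r), and candidate state (c).
        self.Wz = rng.normal(0, s, (hidden_size, input_size))
        self.Uz = rng.normal(0, s, (hidden_size, hidden_size))
        self.Wr = rng.normal(0, s, (hidden_size, input_size))
        self.Ur = rng.normal(0, s, (hidden_size, hidden_size))
        self.Wc = rng.normal(0, s, (hidden_size, input_size))
        self.Uc = rng.normal(0, s, (hidden_size, hidden_size))

    def step(self, x, h_prev):
        z = sigmoid(self.Wz @ x + self.Uz @ h_prev)
        r = sigmoid(self.Wr @ x + self.Ur @ h_prev)
        c = np.tanh(self.Wc @ x + self.Uc @ (r * h_prev))
        return (1.0 - z) * h_prev + z * c

def lattice_encode(edges, num_nodes, embed, cell, hidden_size):
    """Encode a word lattice whose nodes are numbered in topological order.

    edges: list of (src, dst, word) with src < dst, one edge per candidate token.
    embed: dict mapping each word to its input embedding vector.
    Each incoming edge of a node yields one candidate hidden state; the candidates
    are pooled (element-wise max here) into that node's single hidden state.
    Every non-start node is assumed to have at least one incoming edge.
    """
    incoming = {n: [] for n in range(num_nodes)}
    for src, dst, word in edges:
        incoming[dst].append((src, word))

    node_h = {0: np.zeros(hidden_size)}  # hidden state at the lattice start node
    for node in range(1, num_nodes):
        candidates = [cell.step(embed[word], node_h[src])
                      for src, word in incoming[node]]
        node_h[node] = np.max(candidates, axis=0)  # pool over predecessor states
    return node_h

# Toy lattice with two tokenizations of a three-character span:
# 0 --A--> 1 --BC--> 3   and   0 --AB--> 2 --C--> 3
rng = np.random.default_rng(1)
embed = {w: rng.normal(0, 0.1, 8) for w in ["A", "BC", "AB", "C"]}
cell = GRUCell(input_size=8, hidden_size=16)
states = lattice_encode([(0, 1, "A"), (1, 3, "BC"), (0, 2, "AB"), (2, 3, "C")],
                        num_nodes=4, embed=embed, cell=cell, hidden_size=16)
print(states[3].shape)  # hidden state at the final lattice node: (16,)
```

In this sketch the standard left-to-right RNN is recovered when every node has exactly one incoming edge; with multiple tokenizations, the pooling step is what lets a node read from arbitrarily many preceding states, which is the generalization the abstract describes.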

Published

2017-02-12

How to Cite

Su, J., Tan, Z., Xiong, D., Ji, R., Shi, X., & Liu, Y. (2017). Lattice-Based Recurrent Neural Network Encoders for Neural Machine Translation. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10968