Translation Prediction with Source Dependency-Based Context Representation

Authors

  • Kehai Chen, Harbin Institute of Technology
  • Tiejun Zhao, Harbin Institute of Technology
  • Muyun Yang, Harbin Institute of Technology
  • Lemao Liu, National Institute of Information and Communications Technology

DOI:

https://doi.org/10.1609/aaai.v31i1.10978

Keywords:

translation prediction, source dependency, neural network, context representation, statistical machine translation

Abstract

Learning context representations is a promising way to improve translation results, particularly through neural networks. Previous efforts process the context words sequentially and neglect their internal syntactic structure. In this paper, we propose a novel neural network based on a bi-convolutional architecture to represent the source dependency-based context for translation prediction. The proposed model not only encodes long-distance dependencies but also captures functional similarities for better translation prediction (i.e., translation of ambiguous words and of word forms). Examined on a large-scale Chinese-English translation task, the proposed approach achieves a significant improvement (of up to +1.9 BLEU points) over the baseline system, and also outperforms a number of context-enhanced comparison systems.
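To make the idea concrete, below is a minimal, hypothetical sketch of a "bi-convolutional" dependency-based context encoder in PyTorch. It is not the authors' exact architecture: it assumes the source-side context of a word is split into two dependency-derived sequences (e.g., a head/ancestor path and the child/sibling words), each encoded by its own 1-D convolution with max-over-time pooling, and the concatenated representation is used to score candidate target translations. All layer sizes and names are illustrative.

```python
# Hypothetical sketch of a bi-convolutional dependency-context encoder.
# NOT the paper's exact model: branch definitions and dimensions are assumptions.
import torch
import torch.nn as nn


class BiConvContextEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, num_filters=64, window=3, num_targets=5000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # One convolution branch per dependency-based context sequence.
        self.conv_path = nn.Conv1d(emb_dim, num_filters, kernel_size=window, padding=window // 2)
        self.conv_children = nn.Conv1d(emb_dim, num_filters, kernel_size=window, padding=window // 2)
        # Score target-side translation candidates from the joint representation.
        self.output = nn.Linear(2 * num_filters, num_targets)

    def encode_branch(self, conv, token_ids):
        # token_ids: (batch, seq_len) -> branch representation (batch, num_filters)
        x = self.embed(token_ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
        h = torch.tanh(conv(x))                     # (batch, num_filters, seq_len)
        return h.max(dim=2).values                  # max-over-time pooling

    def forward(self, path_ids, child_ids):
        h_path = self.encode_branch(self.conv_path, path_ids)
        h_children = self.encode_branch(self.conv_children, child_ids)
        context = torch.cat([h_path, h_children], dim=1)
        return self.output(context)                 # unnormalized translation scores


# Toy usage: a batch of 2 source words, each with a head-path context of
# length 4 and a children context of length 3 (token IDs are placeholders).
model = BiConvContextEncoder(vocab_size=1000)
path_ids = torch.randint(1, 1000, (2, 4))
child_ids = torch.randint(1, 1000, (2, 3))
scores = model(path_ids, child_ids)                 # shape (2, 5000)
```

The two-branch design is one plausible reading of "bi-convolutional": separating the dependency path from the local child context lets each convolution specialize, while the shared embedding table keeps the model compact.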

Published

2017-02-12

How to Cite

Chen, K., Zhao, T., Yang, M., & Liu, L. (2017). Translation Prediction with Source Dependency-Based Context Representation. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10978