TY - JOUR
AU - Zhao, Yang
AU - Zhang, Jiajun
AU - Zong, Chengqing
AU - He, Zhongjun
AU - Wu, Hua
PY - 2019/07/17
Y2 - 2024/03/28
TI - Addressing the Under-Translation Problem from the Entropy Perspective
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 33
IS - 01
SE - AAAI Technical Track: AI and the Web
DO - 10.1609/aaai.v33i01.3301451
UR - https://ojs.aaai.org/index.php/AAAI/article/view/3817
SP - 451-458
AB - Neural Machine Translation (NMT) has drawn much attention due to its promising translation performance in recent years. However, the under-translation problem remains a major challenge. In this paper, we focus on the under-translation problem and attempt to identify which kinds of source words are more likely to be ignored. Through analysis, we observe that a source word with a large translation entropy is more inclined to be dropped. To address this problem, we propose a coarse-to-fine framework. In the coarse-grained phase, we introduce a simple strategy to reduce the entropy of high-entropy words by constructing pseudo target sentences. In the fine-grained phase, we propose three methods, including a pre-training method, a multitask method, and a two-pass method, to encourage the neural model to correctly translate these high-entropy words. Experimental results on various translation tasks show that our method significantly improves translation quality and substantially reduces the under-translation cases of high-entropy words.
ER -