Meta-Curriculum Learning for Domain Adaptation in Neural Machine Translation

Authors

  • Runzhe Zhan University of Macau
  • Xuebo Liu University of Macau
  • Derek F. Wong University of Macau
  • Lidia S. Chao University of Macau

Keywords

Machine Translation & Multilinguality

Abstract

Meta-learning has been shown to be beneficial for low-resource neural machine translation (NMT). However, we find that meta-trained NMT fails to improve translation performance on domains unseen at the meta-training stage. In this paper, we aim to alleviate this issue by proposing a novel meta-curriculum learning method for domain adaptation in NMT. During meta-training, the NMT model first learns the curricula that are similar across domains, which helps it avoid falling into a bad local optimum early, and then learns the domain-specific curricula, which improves its robustness in acquiring domain-specific knowledge. Experimental results on 10 different low-resource domains show that meta-curriculum learning improves translation performance on both familiar and unfamiliar domains. All the code and data are freely available at https://github.com/NLP2CT/Meta-Curriculum.
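The two-phase schedule described in the abstract can be sketched as follows. This is a hypothetical toy illustration, not the authors' implementation: the divergence scorer, the half/half curriculum split, and the Reptile-style scalar "meta-step" are all illustrative assumptions standing in for the paper's actual scoring and MAML-based training.

```python
# Toy sketch of meta-curriculum learning: early meta-training steps sample
# from the curricula shared across domains (low divergence scores), while
# later steps sample from the domain-specific curricula (high scores).
# All names and the scalar "model" below are illustrative assumptions.

def order_curriculum(examples, divergence):
    """Sort a domain's examples so domain-general ones (low divergence) come first."""
    return sorted(examples, key=divergence)

def meta_train(domains, divergence, steps=4, meta_lr=0.5):
    """Reptile-style toy loop: the first half of the meta-training steps uses
    the shared (low-divergence) half of each domain's curriculum, the second
    half uses the domain-specific (high-divergence) half."""
    theta = 0.0  # stand-in for the NMT model parameters
    for step in range(steps):
        for examples in domains.values():
            ranked = order_curriculum(examples, divergence)
            cut = len(ranked) // 2
            batch = ranked[:cut] if step < steps // 2 else ranked[cut:]
            # Inner "adaptation" on this domain's batch: pull theta toward
            # the batch mean (a toy objective), then apply a meta-update.
            inner = sum(batch) / len(batch)
            theta += meta_lr * (inner - theta)
    return theta

# Usage: two toy "domains" whose examples are scored by a divergence function.
domains = {"medical": [3.0, 1.0], "law": [4.0, 2.0]}
theta = meta_train(domains, divergence=lambda x: x)
```

The key design point the sketch mirrors is the ordering, not the update rule: every domain contributes to every meta-step, but *which* part of its curriculum it contributes shifts from shared to domain-specific as training progresses.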

Published

2021-05-18

How to Cite

Zhan, R., Liu, X., Wong, D. F., & Chao, L. S. (2021). Meta-Curriculum Learning for Domain Adaptation in Neural Machine Translation. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 14310-14318. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17683

Section

AAAI Technical Track on Speech and Natural Language Processing III