Deep Fusing Pre-trained Models into Neural Machine Translation

Authors

  • Rongxiang Weng (School of Computer Science and Technology, Soochow University, Suzhou, China; Machine Intelligence Technology Lab, Alibaba Group, Hangzhou, China)
  • Heng Yu (Machine Intelligence Technology Lab, Alibaba Group, Hangzhou, China)
  • Weihua Luo (Machine Intelligence Technology Lab, Alibaba Group, Hangzhou, China)
  • Min Zhang (School of Computer Science and Technology, Soochow University, Suzhou, China)

DOI:

https://doi.org/10.1609/aaai.v36i10.21399

Keywords:

Speech & Natural Language Processing (SNLP)

Abstract

Pre-training and fine-tuning have become the de facto paradigm in many natural language processing (NLP) tasks. However, unlike other NLP tasks, neural machine translation (NMT) aims to generate target-language sentences from the contextual representations of their source-language counterparts. This characteristic means the optimization objective of NMT is far from that of universal pre-trained models (PTMs), so the standard pre-training and fine-tuning procedure does not work well in NMT. In this paper, we propose a novel framework to deeply fuse the pre-trained representation into NMT, fully exploring the potential of PTMs in NMT. Specifically, we directly replace the randomly initialized Transformer encoder with a pre-trained encoder and propose a layer-wise coordination structure to coordinate learning between the PTM and the NMT decoder. Then, we introduce a partitioned multi-task learning method to fine-tune the pre-trained parameters, reducing the gap between the PTM and NMT by progressively learning the task-specific representation. Experimental results show that our approach achieves considerable improvements on the WMT14 En2De, WMT14 En2Fr, and WMT16 Ro2En translation benchmarks and outperforms previous work in both autoregressive and non-autoregressive NMT models.
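
To make the layer-wise coordination idea concrete, here is a minimal PyTorch sketch, assuming a decoder whose i-th layer cross-attends to the hidden states of the i-th pre-trained encoder layer rather than only the encoder's top layer. All class names, dimensions, and the omitted causal mask are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (not the paper's code) of layer-wise coordination:
    # decoder layer i attends to PTM encoder layer i's hidden states.
    import torch
    import torch.nn as nn

    class CoordinatedDecoderLayer(nn.Module):
        def __init__(self, d_model=512, n_heads=8):
            super().__init__()
            self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.ffn = nn.Sequential(
                nn.Linear(d_model, 4 * d_model), nn.ReLU(), nn.Linear(4 * d_model, d_model)
            )
            self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(3)])

        def forward(self, y, ptm_layer_states):
            # Self-attention over the target prefix (causal mask omitted for brevity).
            y = self.norms[0](y + self.self_attn(y, y, y, need_weights=False)[0])
            # Layer-wise coordination: attend to the PTM states of the *matching* layer.
            y = self.norms[1](
                y + self.cross_attn(y, ptm_layer_states, ptm_layer_states, need_weights=False)[0]
            )
            return self.norms[2](y + self.ffn(y))

    class CoordinatedDecoder(nn.Module):
        def __init__(self, n_layers=6, d_model=512):
            super().__init__()
            self.layers = nn.ModuleList(
                [CoordinatedDecoderLayer(d_model) for _ in range(n_layers)]
            )

        def forward(self, y, ptm_all_layer_states):
            # ptm_all_layer_states: one hidden-state tensor per PTM encoder layer
            # (e.g. obtained via output_hidden_states=True in a Hugging Face encoder).
            for layer, enc_states in zip(self.layers, ptm_all_layer_states):
                y = layer(y, enc_states)
            return y

    # Example: coordinate a 6-layer decoder with 6 layers of PTM states.
    dec = CoordinatedDecoder()
    states = [torch.randn(2, 9, 512) for _ in range(6)]   # per-layer encoder states
    out = dec(torch.randn(2, 7, 512), states)             # (batch, tgt_len, d_model)

This pairing of decoder and encoder layers is one plausible reading of "layer-wise coordination"; the paper itself should be consulted for the exact structure and the partitioned multi-task fine-tuning schedule.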

Published

2022-06-28

How to Cite

Weng, R., Yu, H., Luo, W., & Zhang, M. (2022). Deep Fusing Pre-trained Models into Neural Machine Translation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(10), 11468-11476. https://doi.org/10.1609/aaai.v36i10.21399

Issue

Vol. 36 No. 10 (2022)

Section

AAAI Technical Track on Speech and Natural Language Processing