Unsupervised Neural Dialect Translation with Commonality and Diversity Modeling

Authors

  • Yu Wan, University of Macau
  • Baosong Yang, University of Macau
  • Derek F. Wong, University of Macau
  • Lidia S. Chao, University of Macau
  • Haihua Du, University of Macau
  • Ben C.H. Ao, University of Macau

DOI:

https://doi.org/10.1609/aaai.v34i05.6448

Abstract

As a special machine translation task, dialect translation has two main characteristics: 1) the lack of a parallel training corpus; and 2) similar grammar on the two sides of the translation. In this paper, we investigate how to exploit the commonality and diversity between dialects so as to build unsupervised translation models with access to only monolingual data. Specifically, we leverage pivot-private embeddings, layer coordination, and parameter sharing to sufficiently model the commonality and diversity between source and target, ranging from the lexical, through the syntactic, to the semantic level. To examine the effectiveness of the proposed models, we collect monolingual corpora of 20 million sentences each for Mandarin and Cantonese, which are the official language and the most widely used dialect in China, respectively. Experimental results reveal that our methods outperform rule-based Simplified-Traditional Chinese conversion and conventional unsupervised translation models by over 12 BLEU points.
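
The abstract names pivot-private embeddings as the lexical-level device for modeling what the two dialects share and where they differ. As a rough illustration only, the Python sketch below assumes one plausible reading: the vocabulary is split into a shared (pivot) range and dialect-specific (private) ranges, with a common table for the former and per-dialect tables for the latter. The class name, vocabulary split, and combination rule are assumptions for illustration, not the paper's exact formulation.

    import torch
    import torch.nn as nn

    class PivotPrivateEmbedding(nn.Module):
        """Hypothetical pivot-private embedding (illustrative sketch only).

        Token ids in [0, pivot_vocab) index a table shared by both
        dialects (commonality); ids in [pivot_vocab, pivot_vocab +
        private_vocab) index a per-dialect table (diversity).
        """

        def __init__(self, pivot_vocab: int, private_vocab: int, dim: int,
                     num_dialects: int = 2):
            super().__init__()
            self.pivot_vocab = pivot_vocab
            self.shared = nn.Embedding(pivot_vocab, dim)
            self.private = nn.ModuleList(
                nn.Embedding(private_vocab, dim) for _ in range(num_dialects)
            )

        def forward(self, token_ids: torch.Tensor, dialect: int) -> torch.Tensor:
            # Mask picking out shared-vocabulary tokens.
            is_pivot = token_ids < self.pivot_vocab
            # Clamping keeps every lookup in range; the mask then decides
            # which of the two candidate vectors survives per position.
            shared = self.shared(token_ids.clamp(max=self.pivot_vocab - 1))
            private = self.private[dialect](
                (token_ids - self.pivot_vocab).clamp(min=0))
            return torch.where(is_pivot.unsqueeze(-1), shared, private)

    # Toy usage: a batch of 4 sentences, 16 tokens each, embedded for
    # dialect 0 (say, Mandarin) versus dialect 1 (say, Cantonese).
    emb = PivotPrivateEmbedding(pivot_vocab=8000, private_vocab=2000, dim=512)
    ids = torch.randint(0, 10000, (4, 16))
    print(emb(ids, dialect=0).shape)  # torch.Size([4, 16, 512])

Sharing the pivot table between dialects is what lets monolingual data on either side train a common lexical space, while the private tables leave room for dialect-specific vocabulary; the layer-coordination and parameter-sharing components mentioned in the abstract extend the same commonality-diversity idea to the syntactic and semantic levels.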

Published

2020-04-03

How to Cite

Wan, Y., Yang, B., Wong, D. F., Chao, L. S., Du, H., & Ao, B. C. (2020). Unsupervised Neural Dialect Translation with Commonality and Diversity Modeling. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9130-9137. https://doi.org/10.1609/aaai.v34i05.6448

Issue

Vol. 34 No. 05 (2020)

Section

AAAI Technical Track: Natural Language Processing