Neuron Interaction Based Representation Composition for Neural Machine Translation

Authors

  • Jian Li, The Chinese University of Hong Kong
  • Xing Wang, Tencent AI Lab
  • Baosong Yang, University of Macau
  • Shuming Shi, Tencent AI Lab
  • Michael R. Lyu, The Chinese University of Hong Kong
  • Zhaopeng Tu, Tencent AI Lab

DOI:

https://doi.org/10.1609/aaai.v34i05.6334

Abstract

Recent NLP studies reveal that substantial linguistic information can be attributed to single neurons, i.e., individual dimensions of the representation vectors. We hypothesize that modeling strong interactions among neurons helps to better capture complex information by composing the linguistic properties embedded in individual neurons. Starting from this intuition, we propose a novel approach to compose representations learned by different components in neural machine translation (e.g., multi-layer networks or multi-head attention), based on modeling strong interactions among neurons in the representation vectors. Specifically, we leverage bilinear pooling to model pairwise multiplicative interactions among individual neurons, and a low-rank approximation to make the model computationally feasible. We further propose extended bilinear pooling to incorporate first-order representations. Experiments on WMT14 English⇒German and English⇒French translation tasks show that our model consistently improves performance over the state-of-the-art Transformer baseline. Further analyses demonstrate that our approach indeed captures more syntactic and semantic information as expected.
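For a concrete picture of the technique the abstract describes, below is a minimal PyTorch sketch of low-rank (extended) bilinear pooling between two representation vectors. This is an illustrative reconstruction under stated assumptions, not the authors' released implementation: the class name, the dimension parameters (d_model, d_rank), and the projection layout are all assumptions for the sake of the example.

```python
import torch
import torch.nn as nn


class LowRankBilinearComposition(nn.Module):
    """Sketch of low-rank (extended) bilinear pooling for composing two
    representation vectors (e.g., outputs of different encoder layers or
    attention heads).

    Full bilinear pooling models every pairwise product x_i * y_j of
    neurons, which needs O(d^2) parameters per output dimension. A
    standard low-rank factorization replaces this with two linear
    projections followed by an element-wise product.
    """

    def __init__(self, d_model: int = 512, d_rank: int = 128):
        super().__init__()
        # Appending a constant 1 to each input ("extended" bilinear
        # pooling) lets the element-wise product also carry first-order
        # terms, not only second-order neuron interactions.
        self.proj_x = nn.Linear(d_model + 1, d_rank, bias=False)
        self.proj_y = nn.Linear(d_model + 1, d_rank, bias=False)
        self.proj_out = nn.Linear(d_rank, d_model, bias=False)

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        ones = x.new_ones(x.shape[:-1] + (1,))
        x_ext = torch.cat([x, ones], dim=-1)  # [x; 1]
        y_ext = torch.cat([y, ones], dim=-1)  # [y; 1]
        # Element-wise product of the low-rank projections approximates
        # the pairwise multiplicative interactions among neurons.
        return self.proj_out(self.proj_x(x_ext) * self.proj_y(y_ext))


# Example usage with hypothetical shapes (batch, sequence length, hidden size).
pool = LowRankBilinearComposition(d_model=512, d_rank=128)
x = torch.randn(2, 10, 512)  # e.g., output of one encoder layer
y = torch.randn(2, 10, 512)  # e.g., output of another layer or attention head
composed = pool(x, y)        # -> shape (2, 10, 512)
```

The design choice to factor the bilinear form through a rank-d_rank bottleneck is what makes the pairwise interaction model computationally feasible, as the abstract notes.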

Published

2020-04-03

How to Cite

Li, J., Wang, X., Yang, B., Shi, S., Lyu, M. R., & Tu, Z. (2020). Neuron Interaction Based Representation Composition for Neural Machine Translation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8204-8211. https://doi.org/10.1609/aaai.v34i05.6334

Issue

Vol. 34 No. 05 (2020)

Section

AAAI Technical Track: Natural Language Processing