Simultaneous Learning of Pivots and Representations for Cross-Domain Sentiment Classification

Authors

  • Liang Li, Tsinghua University
  • Weirui Ye, School of Software, BNRist, Tsinghua University
  • Mingsheng Long, Tsinghua University
  • Yateng Tang, Tencent Inc.
  • Jin Xu, Tencent Inc.
  • Jianmin Wang, Tsinghua University

DOI:

https://doi.org/10.1609/aaai.v34i05.6336

Abstract

Cross-domain sentiment classification aims to leverage useful knowledge from a source domain to mitigate the supervision sparsity in a target domain. A series of approaches depend on pivot features that behave similarly for polarity prediction in both domains. However, engineering such pivot features remains cumbersome and prevents us from learning disentangled and transferable representations from rich semantic and syntactic information. Towards learning pivots and representations simultaneously, we propose a new Transferable Pivot Transformer (TPT). Our model consists of two networks: a Pivot Selector that learns to detect transferable n-gram pivots from contexts, and a Transferable Transformer that learns to generate domain-invariant representations by modeling the correlation between pivot and non-pivot words. The Pivot Selector and the Transferable Transformer are jointly optimized through end-to-end back-propagation. We experiment with real-world cross-domain sentiment classification tasks over 20 domain pairs, where our model outperforms prior methods.
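The two-network design described above can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the convolutional bigram scorer standing in for n-gram pivot detection, the pivot-probability reweighting of token embeddings, and all names, dimensions, and hyperparameters are illustrative assumptions. The sketch only shows how a pivot selector and a Transformer encoder can share one loss and be optimized jointly end to end.

```python
import torch
import torch.nn as nn

class PivotSelector(nn.Module):
    """Scores each token from its local n-gram context; high scores mark candidate pivots.
    (A bigram convolution is an illustrative stand-in for the paper's pivot detection.)"""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, hidden_dim, kernel_size=2, padding=1)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)      # (batch, embed_dim, seq_len)
        h = torch.relu(self.conv(x))[:, :, :tokens.size(1)].transpose(1, 2)
        return torch.sigmoid(self.score(h).squeeze(-1))   # soft pivot probabilities

class TransferableTransformer(nn.Module):
    """Encodes a sentence while up-weighting tokens flagged as pivots, so the
    encoder can relate pivot and non-pivot words when predicting polarity."""
    def __init__(self, vocab_size, embed_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, tokens, pivot_probs):
        x = self.embed(tokens)
        x = x * (1.0 + pivot_probs.unsqueeze(-1))   # inject the pivot signal
        h = self.encoder(x)                         # (batch, seq_len, embed_dim)
        return self.classifier(h.mean(dim=1))       # sentence-level polarity logits

# Joint end-to-end optimization of both networks on a dummy source-domain batch.
vocab_size = 10_000
selector = PivotSelector(vocab_size)
transformer = TransferableTransformer(vocab_size)
optimizer = torch.optim.Adam(list(selector.parameters()) +
                             list(transformer.parameters()), lr=1e-3)
criterion = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab_size, (8, 32))
labels = torch.randint(0, 2, (8,))
loss = criterion(transformer(tokens, selector(tokens)), labels)
loss.backward()                                    # gradients flow into both networks
optimizer.step()
```

Because the selector's output feeds the encoder, a single backward pass updates both networks, mirroring the joint end-to-end back-propagation described in the abstract.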

Published

2020-04-03

How to Cite

Li, L., Ye, W., Long, M., Tang, Y., Xu, J., & Wang, J. (2020). Simultaneous Learning of Pivots and Representations for Cross-Domain Sentiment Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8220-8227. https://doi.org/10.1609/aaai.v34i05.6336

Issue

Vol. 34 No. 05 (2020)

Section

AAAI Technical Track: Natural Language Processing