Adaptive Multi-Compositionality for Recursive Neural Models with Applications to Sentiment Analysis

Authors

  • Li Dong, Beihang University
  • Furu Wei, Microsoft Research
  • Ming Zhou, Microsoft Research
  • Ke Xu, Beihang University

DOI:

https://doi.org/10.1609/aaai.v28i1.8930

Keywords:

recursive neural network, semantic composition, deep learning, sentiment classification

Abstract

Recursive neural models have achieved promising results in many natural language processing tasks. The main difference among these models lies in the composition function, i.e., how to obtain the vector representation for a phrase or sentence from the representations of the words it contains. This paper introduces a novel Adaptive Multi-Compositionality (AdaMC) layer to recursive neural models. The basic idea is to use more than one composition function and adaptively select among them depending on the input vectors. We present a general framework that models each semantic composition as a distribution over these composition functions. The composition functions and the parameters used for adaptive selection are learned jointly from data. We integrate AdaMC into existing recursive neural models and conduct extensive experiments on the Stanford Sentiment Treebank. The results illustrate that AdaMC significantly outperforms state-of-the-art sentiment classification methods. It helps push the best accuracy of sentence-level negative/positive classification from 85.4% up to 88.5%.
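To make the idea concrete, the sketch below illustrates one plausible reading of the abstract: K candidate composition functions (here, simple tanh layers) combine two child vectors, a gating transform maps the concatenated children to a softmax distribution over the K functions, and the parent vector is the expectation of the K outputs under that distribution. The specific parameterization (tanh compositions, a single gating matrix, the dimensions) is an assumption for illustration, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

class AdaMCComposition:
    """Sketch of an adaptive multi-compositionality layer (assumed parameterization)."""

    def __init__(self, dim, num_functions, rng=None):
        rng = rng or np.random.default_rng(0)
        self.dim = dim
        self.K = num_functions
        # One (dim x 2*dim) weight matrix per candidate composition function.
        self.W = rng.normal(scale=0.01, size=(num_functions, dim, 2 * dim))
        # Gating parameters: map the concatenated children to K selection scores.
        self.S = rng.normal(scale=0.01, size=(num_functions, 2 * dim))

    def compose(self, left, right):
        children = np.concatenate([left, right])   # shape (2*dim,)
        probs = softmax(self.S @ children)          # distribution over the K functions
        outputs = np.tanh(self.W @ children)        # shape (K, dim), one output per function
        return probs @ outputs                      # expected (weighted) parent vector

# Example: compose two 5-dimensional child vectors with 3 candidate functions.
layer = AdaMCComposition(dim=5, num_functions=3)
parent = layer.compose(np.ones(5), -np.ones(5))
print(parent.shape)  # (5,)
```

In a full recursive neural model, this composition would be applied bottom-up over a parse tree, and both the composition weights and the gating parameters would be trained jointly with the sentiment classifier, as the abstract describes.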

Published

2014-06-21

How to Cite

Dong, L., Wei, F., Zhou, M., & Xu, K. (2014). Adaptive Multi-Compositionality for Recursive Neural Models with Applications to Sentiment Analysis. Proceedings of the AAAI Conference on Artificial Intelligence, 28(1). https://doi.org/10.1609/aaai.v28i1.8930