Learning Sentiment-Specific Word Embedding via Global Sentiment Representation

Authors

  • Peng Fu, Institute of Information Engineering, Chinese Academy of Sciences
  • Zheng Lin, Institute of Information Engineering, Chinese Academy of Sciences
  • Fengcheng Yuan, Institute of Information Engineering, Chinese Academy of Sciences
  • Weiping Wang, Institute of Information Engineering, Chinese Academy of Sciences
  • Dan Meng, Institute of Information Engineering, Chinese Academy of Sciences

Keywords:

Sentiment word embedding

Abstract

Context-based word embedding learning approaches can model rich semantic and syntactic information. However, they are problematic for sentiment analysis because words with similar contexts but opposite sentiment polarities, such as good and bad, are mapped to close word vectors in the embedding space. Recently, some sentiment embedding learning methods have been proposed, but most of them are designed to work well on sentence-level texts. Directly applying those models to document-level texts often leads to unsatisfactory results. To address this issue, we present a sentiment-specific word embedding learning architecture that utilizes local context information as well as global sentiment representation. The architecture is applicable to both sentence-level and document-level texts. We take the global sentiment representation as a simple average of the word embeddings in the text, and use a corruption strategy as a sentiment-dependent regularization. Extensive experiments conducted on several benchmark datasets demonstrate that the proposed architecture outperforms state-of-the-art methods for sentiment classification.
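The two ingredients the abstract names can be illustrated with a small sketch: the global sentiment representation is the average of a text's word embeddings, and the corruption strategy randomly drops words before averaging. All names below (the toy vocabulary, embedding table, and function) are illustrative assumptions, not the authors' released code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary and randomly initialized embedding table
# (in the paper these would be learned jointly with the sentiment objective).
vocab = {"good": 0, "movie": 1, "bad": 2, "plot": 3}
dim = 4
E = rng.normal(scale=0.1, size=(len(vocab), dim))

def global_sentiment_representation(words, corruption_rate=0.3):
    """Average the embeddings of the text's words after randomly
    dropping (corrupting) a fraction of them, per the abstract's
    description; corruption_rate is an assumed hyperparameter."""
    ids = [vocab[w] for w in words if w in vocab]
    keep = rng.random(len(ids)) >= corruption_rate  # corruption mask
    kept = [i for i, k in zip(ids, keep) if k]
    if not kept:  # fall back to the full text if every word was dropped
        kept = ids
    return E[kept].mean(axis=0)

doc = ["good", "movie", "good", "plot"]
g = global_sentiment_representation(doc)
print(g.shape)  # one fixed-size vector regardless of text length
```

Because the representation is a corrupted average, it has the same dimensionality for a sentence or a full document, which is what makes the architecture applicable to both.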

Published

2018-04-26

How to Cite

Fu, P., Lin, Z., Yuan, F., Wang, W., & Meng, D. (2018). Learning Sentiment-Specific Word Embedding via Global Sentiment Representation. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11916

Section

Main Track: NLP and Knowledge Representation