Coupled Multi-Layer Attentions for Co-Extraction of Aspect and Opinion Terms

Authors

  • Wenya Wang Nanyang Technological University
  • Sinno Jialin Pan Nanyang Technological University
  • Daniel Dahlmeier SAP Innovation Center Network
  • Xiaokui Xiao Nanyang Technological University

DOI:

https://doi.org/10.1609/aaai.v31i1.10974

Keywords:

information extraction, deep learning, multi-layer attentions, aspect terms extraction, opinion terms extraction

Abstract

Aspect and opinion term co-extraction aims to explicitly extract aspect terms, which describe features of an entity, and opinion terms, which express emotions, from user-generated texts. One effective approach is to exploit the relations between aspect terms and opinion terms by parsing the syntactic structure of each sentence. However, parsing is expensive, and this approach depends heavily on the quality of the parsing results. In this paper, we propose a novel deep learning model, named coupled multi-layer attentions, that provides an end-to-end solution and requires no parsers or other linguistic resources for preprocessing. Specifically, the model is a multi-layer attention network in which each layer consists of a pair of attentions with tensor operators: one attention extracts aspect terms, while the other extracts opinion terms. The two attentions are learned interactively to dually propagate information between aspect terms and opinion terms. Through multiple layers, the model can further exploit indirect relations between terms for more precise extraction. Experimental results on three benchmark datasets from the SemEval 2014 and 2015 challenges show that our model achieves state-of-the-art performance compared with several baselines.
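The coupled-attention idea in the abstract can be sketched in a few lines: each attention keeps a prototype vector for its term type, scores every token against both its own prototype and the other attention's prototype via a tensor operator, and updates its prototype from the attention-weighted tokens. The sketch below is a simplified, illustrative NumPy version; all names, dimensions, and the prototype-update rule are assumptions, not the paper's exact formulation (the paper uses recurrent updates and trained parameters).

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def tensor_score(H, m, G):
    # H: (n, d) token features; m: (d,) prototype; G: (k, d, d) tensor operator
    # returns (n, k): a k-dim composition vector per token
    return np.tanh(np.einsum('nd,kde,e->nk', H, G, m))

def coupled_attention(H, m_self, m_other, G_self, G_other, v):
    # score each token against BOTH prototypes (dual propagation)
    C = np.concatenate([tensor_score(H, m_self, G_self),
                        tensor_score(H, m_other, G_other)], axis=1)  # (n, 2k)
    alpha = softmax(C @ v)          # per-token attention weights, (n,)
    m_new = alpha @ H               # updated prototype: weighted token sum
    return alpha, m_new

# toy setup: 5 tokens, d=4 features, k=3 tensor slices (all illustrative)
n, d, k = 5, 4, 3
H = rng.standard_normal((n, d))
m_a, m_o = rng.standard_normal(d), rng.standard_normal(d)
G_a, G_o = rng.standard_normal((k, d, d)), rng.standard_normal((k, d, d))
v = rng.standard_normal(2 * k)

# one layer: the aspect attention sees the opinion prototype, and vice versa;
# stacking such layers lets indirect relations propagate
alpha_a, m_a = coupled_attention(H, m_a, m_o, G_a, G_o, v)
alpha_o, m_o = coupled_attention(H, m_o, m_a, G_o, G_a, v)
```

In a full model the attention weights `alpha_a` and `alpha_o` would feed per-token aspect/opinion predictions, and all parameters would be trained end-to-end.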

Published

2017-02-12

How to Cite

Wang, W., Pan, S. J., Dahlmeier, D., & Xiao, X. (2017). Coupled Multi-Layer Attentions for Co-Extraction of Aspect and Opinion Terms. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10974