Linguistic Properties Matter for Implicit Discourse Relation Recognition: Combining Semantic Interaction, Topic Continuity and Attribution

Authors

  • Wenqiang Lei National University of Singapore
  • Yuanxin Xiang National University of Singapore
  • Yuwei Wang University of Utah
  • Qian Zhong City University of Hong Kong
  • Meichun Liu City University of Hong Kong
  • Min-Yen Kan National University of Singapore; Smart Systems Institute

Keywords:

natural language processing, linguistics, discourse relation, feature-based model

Abstract

Modern solutions for implicit discourse relation recognition largely build universal models to classify all of the different types of discourse relations. In contrast to such learning models, we build our model from first principles, analyzing the linguistic properties of the individual top-level Penn Discourse Treebank (PDTB) styled implicit discourse relations: Comparison, Contingency and Expansion. We find that the semantic characteristics of each relation type and two cohesion devices, topic continuity and attribution, work together to contribute these linguistic properties. We encode those properties as complex features and feed them into a Naive Bayes classifier, besting baselines (including deep neural network ones) to achieve a new state-of-the-art performance level. Over a strong, feature-based baseline, our system improves one-versus-other binary classification by 4.83% for the Comparison relation and 3.94% for Contingency, and four-way classification by 2.22%.
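The abstract describes a feature-based pipeline: linguistic properties encoded as features, then classified with Naive Bayes. Below is a minimal, illustrative sketch of that idea in pure Python, a Bernoulli Naive Bayes with Laplace smoothing over hypothetical binary features. The feature names, toy data, and function names are all invented for illustration and are not from the paper.

```python
from collections import defaultdict
import math

# Hypothetical binary features for an (Arg1, Arg2) pair, e.g.:
# [polarity_mismatch, topic_chain_continued, attribution_present].
# Labels are top-level PDTB relations. The data is purely illustrative.
train = [
    ([1, 0, 0], "Comparison"),
    ([1, 1, 0], "Comparison"),
    ([0, 1, 0], "Contingency"),
    ([0, 1, 1], "Contingency"),
    ([0, 0, 1], "Expansion"),
    ([0, 1, 1], "Expansion"),
]

def train_nb(data, n_features, alpha=1.0):
    """Fit a Bernoulli Naive Bayes model with Laplace smoothing."""
    class_count = defaultdict(int)
    feat_count = defaultdict(lambda: [0] * n_features)
    for x, y in data:
        class_count[y] += 1
        for i, v in enumerate(x):
            feat_count[y][i] += v
    model = {}
    total = len(data)
    for y, c in class_count.items():
        log_prior = math.log(c / total)
        # Smoothed per-feature Bernoulli probabilities P(f_i = 1 | y).
        probs = [(feat_count[y][i] + alpha) / (c + 2 * alpha)
                 for i in range(n_features)]
        model[y] = (log_prior, probs)
    return model

def predict(model, x):
    """Return the class with the highest log posterior for feature vector x."""
    best, best_score = None, float("-inf")
    for y, (log_prior, probs) in model.items():
        score = log_prior
        for v, p in zip(x, probs):
            score += math.log(p if v else 1.0 - p)
        if score > best_score:
            best, best_score = y, score
    return best
```

In the paper's setting the features would be replaced by the semantic-interaction, topic-continuity and attribution properties the abstract names, and the classifier run either one-versus-other per relation or four-way.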

Published

2018-04-26

How to Cite

Lei, W., Xiang, Y., Wang, Y., Zhong, Q., Liu, M., & Kan, M.-Y. (2018). Linguistic Properties Matter for Implicit Discourse Relation Recognition: Combining Semantic Interaction, Topic Continuity and Attribution. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11933

Section

Main Track: NLP and Knowledge Representation