A Neural Transition-Based Approach for Semantic Dependency Graph Parsing

Authors

  • Yuxuan Wang, Harbin Institute of Technology
  • Wanxiang Che, Harbin Institute of Technology
  • Jiang Guo, Harbin Institute of Technology
  • Ting Liu, Harbin Institute of Technology

Keywords

Semantic Dependency Graph, Transition-Based Parsing, Bi-LSTM Subtraction, Incremental Tree-LSTM

Abstract

The semantic dependency graph has recently been proposed as an extension of tree-structured syntactic or semantic representations for natural language sentences. Its distinguishing structural property is multi-head: a node may have multiple heads, which turns parsing into a directed acyclic graph (DAG) parsing problem. Yet because most statistical parsers have focused exclusively on shallow bi-lexical tree structures, DAG parsing remains under-explored. In this paper, we propose a neural transition-based parser that uses a variant of the list-based arc-eager transition algorithm for dependency graph parsing. In particular, we propose two non-trivial improvements for representing the key components of the transition system, to better capture the semantics of segments and of internal sub-graph structures. We test our parser on the SemEval-2016 Task 9 dataset (Chinese) and the SemEval-2015 Task 18 dataset (English). On both benchmark datasets, we obtain results superior or comparable to those of the best performing systems. Our parser can be further improved with a simple ensemble mechanism, yielding state-of-the-art performance.
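To illustrate the multi-head property that motivates the list-based transition system described above, here is a minimal sketch of a list-based arc-eager parser state. It loosely follows the standard stack/deque/buffer formulation of list-based parsing; the class name, action names, and details are illustrative assumptions, not the authors' exact algorithm. The key point is that LEFT-ARC moves a node to the deque rather than discarding it, so the node can later receive additional heads, producing a DAG rather than a tree.

```python
# Illustrative sketch only: a list-based arc-eager transition state for
# DAG parsing. Names and action semantics are simplified assumptions,
# not the paper's exact transition system.

class ListBasedArcEager:
    def __init__(self, n_words):
        # Node 0 is a pseudo-root; nodes 1..n_words are sentence tokens.
        self.stack = [0]                            # sigma: partially processed nodes
        self.deque = []                             # delta: passed-over nodes, still arc-eligible
        self.buffer = list(range(1, n_words + 1))   # beta: unprocessed nodes
        self.arcs = set()                           # accumulated (head, dependent, label) arcs

    def left_arc(self, label):
        # Top of stack becomes a dependent of the front of the buffer.
        # Unlike tree parsing, the node goes to the deque instead of being
        # removed for good, so it may later take more heads (multi-head).
        s = self.stack.pop()
        self.arcs.add((self.buffer[0], s, label))
        self.deque.insert(0, s)

    def right_arc(self, label):
        # Front of buffer becomes a dependent of the top of the stack;
        # the buffer node stays available for further arcs.
        self.arcs.add((self.stack[-1], self.buffer[0], label))

    def pass_(self):
        # Move the top of the stack to the deque without adding an arc.
        self.deque.insert(0, self.stack.pop())

    def shift(self):
        # Return all deque nodes to the stack, then push the buffer front.
        while self.deque:
            self.stack.append(self.deque.pop(0))
        self.stack.append(self.buffer.pop(0))


# Usage: on a 3-token sentence, give token 2 two heads (tokens 1 and 3).
p = ListBasedArcEager(3)
p.shift()              # push token 1
p.right_arc("comp")    # arc 1 -> 2
p.shift()              # push token 2
p.left_arc("agt")      # arc 3 -> 2; token 2 now has two heads
```

The arc labels ("comp", "agt") and the transition sequence here are made up for the example; a real parser would predict actions with the neural scorer over the stack, deque, and buffer representations.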

Published

2018-04-27

How to Cite

Wang, Y., Che, W., Guo, J., & Liu, T. (2018). A Neural Transition-Based Approach for Semantic Dependency Graph Parsing. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11968