Dependency Grammar Induction with a Neural Variational Transition-Based Parser

Authors

  • Bowen Li, University of Edinburgh
  • Jianpeng Cheng, University of Edinburgh
  • Yang Liu, University of Edinburgh
  • Frank Keller, University of Edinburgh

DOI:

https://doi.org/10.1609/aaai.v33i01.33016658

Abstract

Dependency grammar induction is the task of learning dependency syntax without annotated training data. Traditional graph-based models with global inference achieve state-of-the-art results on this task, but they require O(n³) run time. Transition-based models enable faster inference with O(n) time complexity, but their performance still lags behind. In this work, we propose a neural transition-based parser for dependency grammar induction, whose inference procedure utilizes rich neural features with O(n) time complexity. We train the parser with an integration of variational inference, posterior regularization, and variance reduction techniques. The resulting framework outperforms previous unsupervised transition-based dependency parsers and achieves performance comparable to graph-based models, both on the English Penn Treebank and on the Universal Dependency Treebank. In an empirical comparison, we show that our approach substantially increases parsing speed over graph-based models.
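To make the complexity contrast concrete, here is a minimal Python sketch of a generic arc-standard transition system, the family of parsers the abstract refers to. The function names (`parse`, `choose_action`) and the toy `left_branching` policy are illustrative assumptions, not the authors' model: their parser scores transitions with rich neural features and is trained variationally. The point the sketch shows is that any such policy consumes a sentence of n words in exactly 2n transitions (n shifts plus n arc attachments, counting the attachment to ROOT), hence O(n) inference, versus the O(n³) global inference of graph-based models.

```python
# Minimal sketch of an arc-standard transition system (hypothetical names,
# not the paper's implementation). Illustrates the O(n) transition count.

def parse(words, choose_action):
    """Parse `words`, returning a list of (head, dependent) index pairs.

    `choose_action(stack, buffer)` must return one of
    "SHIFT", "LEFT-ARC", or "RIGHT-ARC".
    """
    stack = [0]                                # token indices; 0 is ROOT
    buffer = list(range(1, len(words) + 1))
    arcs = []
    while buffer or len(stack) > 1:
        action = choose_action(stack, buffer)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))        # move next word onto stack
        elif action == "LEFT-ARC" and len(stack) > 2:
            dep = stack.pop(-2)                # second-from-top depends on top
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC" and len(stack) > 1:
            dep = stack.pop()                  # top depends on second-from-top
            arcs.append((stack[-1], dep))
        else:
            raise ValueError(f"illegal action {action!r}")
    return arcs

# Toy stand-in for a learned policy: shift every word, then attach each
# word to its left neighbour, yielding a right-branching chain from ROOT.
def left_branching(stack, buffer):
    return "SHIFT" if buffer else "RIGHT-ARC"

print(parse(["a", "fast", "parser"], left_branching))
# [(2, 3), (1, 2), (0, 1)] -- 2n = 6 transitions for n = 3 words
```

In the paper's setting, `choose_action` would be replaced by the neural model's transition distribution; the linear transition count, and hence the parsing-speed advantage over graph-based models, is unchanged.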

Published

2019-07-17

How to Cite

Li, B., Cheng, J., Liu, Y., & Keller, F. (2019). Dependency Grammar Induction with a Neural Variational Transition-Based Parser. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6658-6665. https://doi.org/10.1609/aaai.v33i01.33016658

Section

AAAI Technical Track: Natural Language Processing