Latent Dependency Forest Models

Authors

  • Shanbo Chu, ShanghaiTech University
  • Yong Jiang, ShanghaiTech University
  • Kewei Tu, ShanghaiTech University

DOI:

https://doi.org/10.1609/aaai.v31i1.11047

Keywords:

probabilistic modeling, latent dependency forest model, non-projective dependency grammar

Abstract

Probabilistic modeling is one of the foundations of modern machine learning and artificial intelligence. In this paper, we propose a novel type of probabilistic model named the latent dependency forest model (LDFM). An LDFM models the dependencies between random variables with a forest structure that can change dynamically based on the variable values. It is therefore capable of modeling context-specific independence. We parameterize an LDFM using a first-order non-projective dependency grammar. Learning LDFMs from data can be formulated purely as a parameter learning problem, and hence the difficult problem of model structure learning is circumvented. Our experimental results show that LDFMs are competitive with existing probabilistic models.
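Summing over all first-order non-projective dependency structures, as a model like the LDFM requires for normalization, is classically done with the directed Matrix-Tree theorem. The sketch below is illustrative only and is not taken from the paper: the edge weights are arbitrary made-up numbers, node 0 is an assumed virtual root, and the determinant of the reduced Laplacian gives the total weight of all dependency trees.

```python
import numpy as np

# Hypothetical arc weights: w[i, j] = weight of an arc from head i to
# dependent j, with node 0 as a virtual root. Values are illustrative;
# the paper's LDFM parameterization is not reproduced here.
n = 4  # virtual root + 3 variables
rng = np.random.default_rng(0)
w = rng.uniform(0.1, 1.0, size=(n, n))
np.fill_diagonal(w, 0.0)  # no self-loops
w[:, 0] = 0.0             # no arcs point at the root

# Kirchhoff/Matrix-Tree construction: L[j, j] = total incoming weight
# of node j, and L[i, j] = -w[i, j] for i != j. Deleting the root's row
# and column and taking the determinant sums the weights of all spanning
# arborescences rooted at 0, i.e. all non-projective dependency trees.
L = np.diag(w.sum(axis=0)) - w
Z = np.linalg.det(L[1:, 1:])
print(Z)  # partition function: total weight of all dependency trees
```

Because the determinant is computed in O(n^3) time, this normalization stays tractable even though the number of dependency trees grows super-exponentially in n.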

Published

2017-02-12

How to Cite

Chu, S., Jiang, Y., & Tu, K. (2017). Latent Dependency Forest Models. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.11047

Section

AAAI Technical Track: Reasoning under Uncertainty