Conditional PSDDs: Modeling and Learning With Modular Knowledge

Authors

  • Yujia Shen University of California, Los Angeles
  • Arthur Choi University of California, Los Angeles
  • Adnan Darwiche University of California, Los Angeles

DOI:

https://doi.org/10.1609/aaai.v32i1.12119

Keywords:

Bayesian Networks, Decision Diagrams, Learning, Constraints

Abstract

Probabilistic Sentential Decision Diagrams (PSDDs) have been proposed for learning tractable probability distributions from a combination of data and background knowledge (in the form of Boolean constraints). In this paper, we propose a variant of PSDDs, called conditional PSDDs, for representing a family of distributions that are conditioned on the same set of variables. Conditional PSDDs can also be learned from a combination of data and (modular) background knowledge. We use conditional PSDDs to define a more structured version of Bayesian networks, in which nodes can have an exponential number of states, hence expanding the scope of domains where Bayesian networks can be applied. Compared to classical PSDDs, the new representation exploits the independencies captured by a Bayesian network to decompose the learning process into localized learning tasks, which enables the learning of better models while using less computation. We illustrate the promise of conditional PSDDs and structured Bayesian networks empirically, and with a case study on modeling distributions over routes on a map.
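The core idea behind the case study — learning a distribution from data while a Boolean constraint restricts the support to valid routes — can be illustrated with a minimal sketch. This is not the paper's PSDD algorithm; it is a hypothetical toy that enumerates the simple paths of a small, made-up map (the constraint) and fits a smoothed maximum-likelihood distribution over only those paths, so that invalid routes receive probability exactly zero.

```python
from collections import Counter

# Hypothetical 4-node map with undirected edges (an assumption for illustration).
edges = {("A", "B"), ("B", "C"), ("A", "D"), ("D", "C")}

def neighbors(v):
    """All nodes adjacent to v in the undirected edge set."""
    return {b for a, b in edges if a == v} | {a for a, b in edges if b == v}

def simple_paths(src, dst, path=None):
    """Enumerate all simple (cycle-free) paths from src to dst."""
    path = path or [src]
    if path[-1] == dst:
        yield tuple(path)
        return
    for n in neighbors(path[-1]):
        if n not in path:
            yield from simple_paths(src, dst, path + [n])

# The "Boolean constraint": only genuine simple routes from A to C are valid.
valid_routes = set(simple_paths("A", "C"))

# Toy observed data: three trips via B, one via D (fabricated for the sketch).
data = [("A", "B", "C")] * 3 + [("A", "D", "C")] * 1

counts = Counter(r for r in data if r in valid_routes)
total = sum(counts.values())

# Laplace-smoothed ML estimate restricted to the constraint's support;
# any route outside valid_routes implicitly has probability zero.
dist = {r: (counts[r] + 1) / (total + len(valid_routes)) for r in valid_routes}
```

A PSDD achieves the same effect without explicit enumeration, which is what makes the approach tractable when the number of valid routes is exponential; the sketch above only conveys the semantics of constraint-respecting learning.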

Published

2018-04-26

How to Cite

Shen, Y., Choi, A., & Darwiche, A. (2018). Conditional PSDDs: Modeling and Learning With Modular Knowledge. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12119

Section

AAAI Technical Track: Reasoning under Uncertainty