Probabilistic Dependency Graphs

Authors

  • Oliver Richardson, Cornell University
  • Joseph Y. Halpern, Cornell University

DOI:

https://doi.org/10.1609/aaai.v35i13.17445

Keywords:

Graphical Models, Knowledge Representation Languages, Other Foundations of Reasoning under Uncertainty

Abstract

We introduce Probabilistic Dependency Graphs (PDGs), a new class of directed graphical models. PDGs can capture inconsistent beliefs in a natural way and are more modular than Bayesian Networks (BNs), in that they make it easier to incorporate new information and restructure the representation. We show by example how PDGs are an especially natural modeling tool. We provide three semantics for PDGs, each of which can be derived from a scoring function (on joint distributions over the variables in the network) that can be viewed as representing a distribution's incompatibility with the PDG. For the PDG corresponding to a BN, this function is uniquely minimized by the distribution the BN represents, showing that PDG semantics extend BN semantics. We show further that factor graphs and their exponential families can also be faithfully represented as PDGs, while there are significant barriers to modeling a PDG with a factor graph.
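To make the idea of scoring a joint distribution against a network concrete, here is a small illustrative sketch. It is not the scoring function defined in the paper; it is a hypothetical, KL-divergence-based "incompatibility" score for a two-variable network X -> Y, included only to show what it means for the network's own joint distribution to minimize such a score. All names and numbers below are invented for illustration.

```python
import numpy as np

# Toy setup: two binary variables X -> Y with conditional probability
# tables p(X) and p(Y | X), as in a tiny Bayesian network.
# NOTE: this is a hypothetical, simplified incompatibility score,
# not the scoring function introduced in the paper.

p_x = np.array([0.3, 0.7])                # p(X)
p_y_given_x = np.array([[0.9, 0.1],       # p(Y | X=0)
                        [0.2, 0.8]])      # p(Y | X=1)

def kl(p, q):
    """KL divergence between two discrete distributions (in nats)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def incompatibility(mu):
    """Toy incompatibility of a joint distribution mu(X, Y) with the CPDs:
    KL between mu's marginal on X and p(X), plus the expected KL between
    mu's conditional of Y given X and p(Y | X)."""
    mu_x = mu.sum(axis=1)
    score = kl(mu_x, p_x)
    for x in range(2):
        if mu_x[x] > 0:
            score += mu_x[x] * kl(mu[x] / mu_x[x], p_y_given_x[x])
    return score

# The joint distribution the network represents: p(x) * p(y | x).
bn_joint = p_x[:, None] * p_y_given_x
print(incompatibility(bn_joint))          # ~0.0: perfectly compatible

# Any other joint distribution scores strictly worse in this toy case.
other = np.full((2, 2), 0.25)
print(incompatibility(other))             # > 0
```

In this two-variable case the toy score is zero exactly at the joint distribution the network defines and positive elsewhere, which mirrors (in miniature) the abstract's claim that the paper's scoring function is uniquely minimized by the distribution a BN represents.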

Published

2021-05-18

How to Cite

Richardson, O., & Halpern, J. Y. (2021). Probabilistic Dependency Graphs. Proceedings of the AAAI Conference on Artificial Intelligence, 35(13), 12174-12181. https://doi.org/10.1609/aaai.v35i13.17445

Issue

Vol. 35 No. 13 (2021)

Section

AAAI Technical Track on Reasoning under Uncertainty