Computing Divergences between Discrete Decomposable Models

Authors

  • Loong Kuan Lee, Monash University
  • Nico Piatkowski, Fraunhofer IAIS
  • François Petitjean, Monash University
  • Geoffrey I. Webb, Monash University

DOI:

https://doi.org/10.1609/aaai.v37i10.26443

Keywords:

RU: Graphical Model, RU: Stochastic Models & Probabilistic Inference

Abstract

Many applications, including many in machine learning, benefit from computing the exact divergence between two discrete probability measures. Unfortunately, in the absence of any assumptions about the structure or independencies of these distributions, computing the divergence between them is intractable in high dimensions. We show that a wide family of functionals and divergences, such as the alpha-beta divergence, can be computed between two decomposable models, i.e., chordal Markov networks, in time exponential in the treewidth of these models. The alpha-beta divergence is a family of divergences that includes popular divergences such as the Kullback-Leibler divergence, the Hellinger distance, and the chi-squared divergence. Thus, we can compute the exact value of any divergence in this broad class to the extent that we can accurately model the two distributions using decomposable models.
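
As a concrete reference point for the divergences named in the abstract, the sketch below uses the common Cichocki-Amari parameterization of the alpha-beta divergence and evaluates it by brute-force enumeration over a small state space. This is purely illustrative and is not the paper's method; the paper's contribution is precisely to avoid this exponential enumeration for decomposable models. The function name and parameter choices are assumptions made for this example, and the paper may use a different parameterization.

import numpy as np

def ab_divergence(p, q, alpha, beta):
    # Alpha-beta divergence in the Cichocki-Amari parameterization,
    # assuming alpha != 0, beta != 0, and alpha + beta != 0.
    # Limits of (alpha, beta) recover the Kullback-Leibler, Hellinger-type,
    # and chi-squared-type divergences mentioned in the abstract.
    s = alpha + beta
    return -np.sum(p**alpha * q**beta
                   - (alpha / s) * p**s
                   - (beta / s) * q**s) / (alpha * beta)

# Two distributions over 3 binary variables: 2**3 = 8 joint states.
# Brute-force enumeration like this grows exponentially in the number of
# variables, which is what the treewidth-based approach avoids.
rng = np.random.default_rng(0)
p = rng.random(8); p /= p.sum()
q = rng.random(8); q /= q.sum()

print(ab_divergence(p, q, alpha=0.5, beta=0.5))   # 4 * squared Hellinger distance
print(ab_divergence(p, q, alpha=2.0, beta=-1.0))  # 0.5 * Pearson chi-squared divergence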

Published

2023-06-26

How to Cite

Lee, L. K., Piatkowski, N., Petitjean, F., & Webb, G. I. (2023). Computing Divergences between Discrete Decomposable Models. Proceedings of the AAAI Conference on Artificial Intelligence, 37(10), 12243-12251. https://doi.org/10.1609/aaai.v37i10.26443

Issue

Vol. 37 No. 10 (2023)

Section

AAAI Technical Track on Reasoning Under Uncertainty