Efficient Lifting for Online Probabilistic Inference

Authors

  • Aniruddh Nath, University of Washington
  • Pedro Domingos, University of Washington

DOI:

https://doi.org/10.1609/aaai.v24i1.7763

Keywords:

Probabilistic Inference, Relational Probabilistic Models, Graphical Models

Abstract

Lifting can greatly reduce the cost of inference on first-order probabilistic graphical models, but constructing the lifted network can itself be quite costly. In online applications (e.g., video segmentation), repeatedly constructing the lifted network for each new inference can be extremely wasteful, because the evidence typically changes little from one inference to the next. The same is true in many other problems that require repeated inference, such as utility maximization, MAP inference, interactive inference, and parameter and structure learning. In this paper, we propose an efficient algorithm for updating the structure of an existing lifted network with incremental changes to the evidence. This allows us to construct the lifted network once for the initial inference problem and amortize the cost over the subsequent problems. Experiments on video segmentation and viral marketing problems show that the algorithm greatly reduces the cost of inference without affecting the quality of the solutions.
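The core idea, amortizing lifted network construction by updating only the parts affected by evidence changes, can be pictured with a minimal sketch. This is not the paper's algorithm; the LiftedNetwork class, the grouping of atoms into supernodes purely by their evidence value, and the update_evidence method are all hypothetical simplifications introduced here for illustration only.

```python
# Illustrative sketch only, NOT the authors' algorithm: atoms are grouped
# into supernodes by their evidence value, and an evidence delta moves
# only the changed atoms rather than rebuilding every group from scratch.
from collections import defaultdict

class LiftedNetwork:
    def __init__(self, atoms, evidence):
        self.evidence = dict(evidence)       # atom -> True/False; unobserved atoms absent
        self.supernodes = self._lift(atoms)  # full construction, done once

    def _lift(self, atoms):
        groups = defaultdict(set)
        for a in atoms:
            groups[self.evidence.get(a)].add(a)  # same evidence value -> same supernode
        return dict(groups)

    def update_evidence(self, changes):
        """Incrementally move changed atoms between supernodes."""
        for atom, value in changes.items():
            old_key = self.evidence.get(atom)
            if old_key == value:
                continue
            self.supernodes[old_key].discard(atom)
            if not self.supernodes[old_key]:
                del self.supernodes[old_key]
            self.supernodes.setdefault(value, set()).add(atom)
            if value is None:
                self.evidence.pop(atom, None)
            else:
                self.evidence[atom] = value

# Lift once, then apply small evidence deltas between inference calls.
atoms = [f"Pixel({i})" for i in range(6)]
net = LiftedNetwork(atoms, {"Pixel(0)": True})
net.update_evidence({"Pixel(1)": True, "Pixel(0)": None})
print(net.supernodes)
```

In the paper's setting the lifted structure is considerably richer, but the design point is the same: when the evidence delta is small, touching only the affected groups is far cheaper than re-lifting the whole network for every inference.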

Published

2010-07-04

How to Cite

Nath, A., & Domingos, P. (2010). Efficient Lifting for Online Probabilistic Inference. Proceedings of the AAAI Conference on Artificial Intelligence, 24(1), 1193-1198. https://doi.org/10.1609/aaai.v24i1.7763