Conditioning in First-Order Knowledge Compilation and Lifted Probabilistic Inference

Authors

  • Guy Van den Broeck, KU Leuven
  • Jesse Davis, KU Leuven

DOI:

https://doi.org/10.1609/aaai.v26i1.8404

Keywords:

lifted inference, knowledge compilation

Abstract

Knowledge compilation is a powerful technique for compactly representing and efficiently reasoning about logical knowledge bases. It has been successfully applied to numerous problems in artificial intelligence, such as probabilistic inference and conformant planning. Conditioning, which updates a knowledge base with observed truth values for some propositions, is one of the fundamental operations employed for reasoning. In the propositional setting, conditioning can be applied efficiently in all cases. Recently, researchers have explored compilation for first-order knowledge bases. The majority of this work has centered on using first-order d-DNNF circuits as the target compilation language. However, conditioning has not been studied in this setting. This paper explores how to condition a first-order d-DNNF circuit. We show that it is possible to efficiently condition these circuits on unary relations. However, we prove that conditioning on higher-arity relations is #P-hard. We study the implications of these findings for performing lifted inference in first-order probabilistic models. This leads to a better understanding of which types of queries lifted inference can address.
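To make the conditioning operation concrete, the sketch below illustrates the propositional case the abstract refers to: conditioning a circuit in negation normal form on an observed truth value replaces the observed literal by "true" and its negation by "false", simplifying along the way. This is a minimal illustrative example, not the paper's first-order algorithm; the Node type and condition function are hypothetical names introduced here for illustration.

    # Minimal sketch of propositional conditioning on an NNF circuit
    # (illustrative only; not the authors' code or the first-order case).
    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class Node:
        kind: str                      # 'lit', 'and', 'or', 'true', 'false'
        var: str = ''                  # variable name for 'lit' nodes
        positive: bool = True          # polarity for 'lit' nodes
        children: Tuple['Node', ...] = ()

    TRUE, FALSE = Node('true'), Node('false')

    def condition(node: Node, var: str, value: bool) -> Node:
        """Condition the circuit on var = value, simplifying as we go."""
        if node.kind == 'lit':
            if node.var != var:
                return node
            return TRUE if node.positive == value else FALSE
        if node.kind in ('true', 'false'):
            return node
        kids = [condition(c, var, value) for c in node.children]
        if node.kind == 'and':
            if any(k.kind == 'false' for k in kids):
                return FALSE
            kids = [k for k in kids if k.kind != 'true']
            return TRUE if not kids else Node('and', children=tuple(kids))
        # 'or' node
        if any(k.kind == 'true' for k in kids):
            return TRUE
        kids = [k for k in kids if k.kind != 'false']
        return FALSE if not kids else Node('or', children=tuple(kids))

    # Example: (rain OR sprinkler) AND wet, conditioned on rain = False
    circuit = Node('and', children=(
        Node('or', children=(Node('lit', 'rain'), Node('lit', 'sprinkler'))),
        Node('lit', 'wet'),
    ))
    print(condition(circuit, 'rain', False))  # simplifies to sprinkler AND wet

Because each node is visited once, this operation is linear in the circuit size, which is why propositional conditioning is efficient in all cases; the paper's contribution concerns when an analogous operation remains tractable for first-order d-DNNF circuits.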

Published

2021-09-20

How to Cite

Van den Broeck, G., & Davis, J. (2021). Conditioning in First-Order Knowledge Compilation and Lifted Probabilistic Inference. Proceedings of the AAAI Conference on Artificial Intelligence, 26(1), 1961-1967. https://doi.org/10.1609/aaai.v26i1.8404