TY  - JOUR
AU  - Vaezipoor, Pashootan
AU  - Lederman, Gil
AU  - Wu, Yuhuai
AU  - Maddison, Chris
AU  - Grosse, Roger B.
AU  - Seshia, Sanjit A.
AU  - Bacchus, Fahiem
PY  - 2021/05/18
Y2  - 2024/03/28
TI  - Learning Branching Heuristics for Propositional Model Counting
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 35
IS  - 14
SE  - AAAI Technical Track on Search and Optimization
DO  - 10.1609/aaai.v35i14.17474
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/17474
SP  - 12427-12435
AB  - Propositional model counting, or #SAT, is the problem of computing the number of satisfying assignments of a Boolean formula. Many problems from different application areas, including many discrete probabilistic inference problems, can be translated into model counting problems to be solved by #SAT solvers. Exact #SAT solvers, however, are often not scalable to industrial size instances. In this paper, we present Neuro#, an approach for learning branching heuristics to improve the performance of exact #SAT solvers on instances from a given family of problems. We experimentally show that our method reduces the step count on similarly distributed held-out instances and generalizes to much larger instances from the same problem family. It is able to achieve these results on a number of different problem families having very different structures. In addition to step count improvements, Neuro# can also achieve orders of magnitude wall-clock speedups over the vanilla solver on larger instances in some problem families, despite the runtime overhead of querying the model.
ER  -