Learning Markov Random Fields for Combinatorial Structures via Sampling through Lovász Local Lemma


  • Nan Jiang Purdue University
  • Yi Gu Northwestern University
  • Yexiang Xue Purdue University






Learning to generate complex combinatorial structures satisfying constraints will have transformative impacts in many application domains. However, it is beyond the capabilities of existing approaches due to the highly intractable nature of the embedded probabilistic inference. Prior works spend most of the training time learning to separate valid from invalid structures but do not learn the inductive biases of valid structures. We develop NEural Lovász Sampler (NELSON), which embeds the sampler through the Lovász Local Lemma (LLL) as a fully differentiable neural network layer. Our NELSON-CD embeds this sampler into the contrastive divergence learning process of Markov random fields. NELSON allows us to obtain valid samples from the current model distribution. Contrastive divergence is then applied to separate these samples from those in the training set. NELSON is implemented as a fully differentiable neural net, taking advantage of the parallelism of GPUs. Experimental results on several real-world domains reveal that NELSON learns to generate 100% valid structures, while baselines either time out or cannot ensure validity. NELSON also outperforms other approaches in running time, log-likelihood, and MAP scores.
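For readers unfamiliar with sampling through the Lovász Local Lemma, the classic algorithmic version (Moser–Tardos resampling) gives the intuition behind how a sampler can always return valid structures: draw a random assignment, and while any constraint is violated, resample just the variables of a violated constraint. Below is a minimal, non-neural sketch for CNF constraints; the function name and clause encoding are illustrative, not from the paper, which implements this idea as a differentiable GPU layer.

```python
import random

def lll_resample(n_vars, clauses, rng=None):
    """Sample an assignment satisfying all CNF clauses by repeatedly
    resampling the variables of a violated clause (Moser-Tardos style,
    the algorithmic Lovász Local Lemma). Hypothetical helper, not NELSON."""
    rng = rng or random.Random(0)
    # Each clause is a list of signed ints: +i means var i is True, -i means False.
    assign = [rng.random() < 0.5 for _ in range(n_vars + 1)]  # index 0 unused

    def violated(clause):
        return not any(assign[abs(lit)] == (lit > 0) for lit in clause)

    while True:
        bad = [c for c in clauses if violated(c)]
        if not bad:
            return assign[1:]  # 0-indexed assignment for vars 1..n_vars
        # Resample every variable of one violated clause uniformly at random.
        for lit in bad[0]:
            assign[abs(lit)] = rng.random() < 0.5

# Toy instance: (x1 OR x2) AND (NOT x1 OR x3)
clauses = [[1, 2], [-1, 3]]
sample = lll_resample(3, clauses)
```

Under the LLL's condition that constraints do not share too many variables, this loop terminates quickly in expectation, which is what makes it attractive as a building block inside a training loop.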




How to Cite

Jiang, N., Gu, Y., & Xue, Y. (2023). Learning Markov Random Fields for Combinatorial Structures via Sampling through Lovász Local Lemma. Proceedings of the AAAI Conference on Artificial Intelligence, 37(4), 4016-4024. https://doi.org/10.1609/aaai.v37i4.25516



AAAI Technical Track on Constraint Satisfaction and Optimization