Learning Markov Random Fields for Combinatorial Structures via Sampling through Lovász Local Lemma

1 Dec 2022 · Nan Jiang, Yi Gu, Yexiang Xue

Learning to generate complex combinatorial structures satisfying constraints will have transformative impacts in many application domains. However, it is beyond the capabilities of existing approaches due to the highly intractable nature of the embedded probabilistic inference. Prior works spend most of the training time learning to separate valid from invalid structures but do not learn the inductive biases of valid structures. We develop the NEural Lovász Sampler (Nelson), which embeds a sampler based on the Lovász Local Lemma (LLL) as a fully differentiable neural network layer. Our Nelson-CD embeds this sampler into the contrastive divergence learning process of Markov random fields. Nelson allows us to obtain valid samples from the current model distribution; contrastive divergence then separates these samples from those in the training set. Nelson is implemented as a fully differentiable neural net, taking advantage of the parallelism of GPUs. Experimental results on several real-world domains reveal that Nelson learns to generate 100% valid structures, while baselines either time out or cannot ensure validity. Nelson also outperforms other approaches in running time, log-likelihood, and MAP scores.
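To make the learning loop concrete: contrastive divergence updates the model by contrasting the expected statistics of training data with those of samples drawn from the current model, and the LLL sampler supplies samples that are guaranteed to satisfy the constraints. Below is a minimal sketch of this idea for a log-linear MRF over binary variables with CNF-style constraints, using a Moser–Tardos-style resampling loop (the classic algorithmic form of the LLL). All names here (`lll_sample`, `cd_gradient`, `clauses`, `theta`) are illustrative assumptions, not the paper's actual API, and this plain sequential NumPy loop stands in for the paper's differentiable, GPU-parallel layer.

```python
import numpy as np

def violated(x, clauses):
    """Return indices of CNF clauses not satisfied by assignment x.
    Each clause is a list of (var_index, required_value) literals."""
    return [i for i, clause in enumerate(clauses)
            if not any(x[v] == val for v, val in clause)]

def lll_sample(theta, clauses, rng, max_rounds=10_000):
    """Moser-Tardos-style sampler: draw each variable independently from
    its marginal, then repeatedly resample the variables of a violated
    clause until every clause holds."""
    p = 1.0 / (1.0 + np.exp(-theta))              # per-variable marginal
    x = (rng.random(theta.size) < p).astype(int)  # independent initial draw
    for _ in range(max_rounds):
        bad = violated(x, clauses)
        if not bad:
            return x                              # all constraints satisfied
        for v, _ in clauses[bad[0]]:              # resample one violated clause
            x[v] = int(rng.random() < p[v])
    raise RuntimeError("resampling did not converge")

def cd_gradient(theta, data, clauses, rng, k=1):
    """One contrastive-divergence step for a log-linear MRF:
    E_data[x] - E_model[x], with the model expectation estimated from
    k valid samples produced by the LLL sampler."""
    model = np.mean([lll_sample(theta, clauses, rng) for _ in range(k)], axis=0)
    return data.mean(axis=0) - model

# Toy usage: 3 binary variables, 2 clauses, 2 valid training structures.
rng = np.random.default_rng(0)
clauses = [[(0, 1), (1, 1)], [(1, 0), (2, 1)]]   # (x0 or x1) and (not x1 or x2)
data = np.array([[1, 0, 1], [1, 1, 1]])           # both satisfy the clauses
theta = np.zeros(3)
theta += 0.1 * cd_gradient(theta, data, clauses, rng, k=8)
print(theta)
```

Because every model-side sample is drawn only from the valid region, the gradient never wastes capacity on separating valid from invalid structures, which is the failure mode the abstract attributes to prior work.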
