A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization

Sebastian Sanokowski, Sepp Hochreiter, Sebastian Lehner
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:43346-43367, 2024.

Abstract

Learning to sample from intractable distributions over discrete sets without relying on corresponding training data is a central problem in a wide range of fields, including Combinatorial Optimization. Currently, popular deep learning-based approaches rely primarily on generative models that yield exact sample likelihoods. This work introduces a method that lifts this restriction and opens the possibility to employ highly expressive latent variable models like diffusion models. Our approach is conceptually based on a loss that upper bounds the reverse Kullback-Leibler divergence and evades the requirement of exact sample likelihoods. We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems.
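For context, the following is a minimal sketch of the kind of bound the abstract alludes to; the notation here is illustrative and may differ from the paper's. For a target distribution p(x) \propto \exp(-H(x)/T) over discrete solutions x (with H the combinatorial objective) and a latent variable model q_\theta(x) = \sum_z q_\theta(x, z), the chain rule of the KL divergence gives, for any choice of conditional p(z \mid x),

D_{\mathrm{KL}}\left( q_\theta(x) \,\|\, p(x) \right) \;\le\; D_{\mathrm{KL}}\left( q_\theta(x, z) \,\|\, p(x)\, p(z \mid x) \right),

since the joint KL decomposes as the marginal KL plus a nonnegative conditional term. Choosing p(z \mid x) as, e.g., the fixed forward noising process of a diffusion model turns the right-hand side into an expectation over tractable per-step conditionals of the learned reverse process, so the exact marginal sample likelihood q_\theta(x) is never required.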

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-sanokowski24a,
  title     = {A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization},
  author    = {Sanokowski, Sebastian and Hochreiter, Sepp and Lehner, Sebastian},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {43346--43367},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/sanokowski24a/sanokowski24a.pdf},
  url       = {https://proceedings.mlr.press/v235/sanokowski24a.html},
  abstract  = {Learning to sample from intractable distributions over discrete sets without relying on corresponding training data is a central problem in a wide range of fields, including Combinatorial Optimization. Currently, popular deep learning-based approaches rely primarily on generative models that yield exact sample likelihoods. This work introduces a method that lifts this restriction and opens the possibility to employ highly expressive latent variable models like diffusion models. Our approach is conceptually based on a loss that upper bounds the reverse Kullback-Leibler divergence and evades the requirement of exact sample likelihoods. We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems.}
}
Endnote
%0 Conference Paper
%T A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization
%A Sebastian Sanokowski
%A Sepp Hochreiter
%A Sebastian Lehner
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-sanokowski24a
%I PMLR
%P 43346--43367
%U https://proceedings.mlr.press/v235/sanokowski24a.html
%V 235
%X Learning to sample from intractable distributions over discrete sets without relying on corresponding training data is a central problem in a wide range of fields, including Combinatorial Optimization. Currently, popular deep learning-based approaches rely primarily on generative models that yield exact sample likelihoods. This work introduces a method that lifts this restriction and opens the possibility to employ highly expressive latent variable models like diffusion models. Our approach is conceptually based on a loss that upper bounds the reverse Kullback-Leibler divergence and evades the requirement of exact sample likelihoods. We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems.
APA
Sanokowski, S., Hochreiter, S., & Lehner, S. (2024). A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:43346-43367. Available from https://proceedings.mlr.press/v235/sanokowski24a.html.