Diffusion Models as Constrained Samplers for Optimization with Unknown Constraints

Lingkai Kong, Yuanqi Du, Wenhao Mu, Kirill Neklyudov, Valentin De Bortoli, Dongxia Wu, Haorui Wang, Aaron M Ferber, Yian Ma, Carla P Gomes, Chao Zhang
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:4582-4590, 2025.

Abstract

Addressing real-world optimization problems becomes particularly challenging when analytic objective functions or constraints are unavailable. While numerous studies have addressed the issue of unknown objectives, limited research has focused on scenarios where feasibility constraints are not given explicitly. Overlooking these constraints can lead to spurious solutions that are unrealistic in practice. To deal with such unknown constraints, we propose to perform optimization within the data manifold using diffusion models. To constrain the optimization process to the data manifold, we reformulate the original optimization problem as a sampling problem from the product of the Boltzmann distribution defined by the objective function and the data distribution learned by the diffusion model. Depending on the differentiability of the objective function, we propose two different sampling methods. For differentiable objectives, we propose a two-stage framework that begins with a guided diffusion process for warm-up, followed by a Langevin dynamics stage for further correction. For non-differentiable objectives, we propose an iterative importance sampling strategy using the diffusion model as the proposal distribution. Comprehensive experiments on a synthetic dataset, six real-world black-box optimization datasets, and a multi-objective molecule optimization dataset show that our method achieves better or comparable performance with previous state-of-the-art baselines.
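For the differentiable-objective case described above, the core idea reduces to running Langevin dynamics on the product distribution π(x) ∝ exp(−f(x)/T) · p_data(x), where a trained diffusion model would supply the data score ∇ log p_data. The minimal sketch below is an illustration of that sampling step, not the authors' implementation: the objective, the Gaussian stand-in for the data score, and all parameter values are assumptions chosen for clarity.

```python
import numpy as np

def objective_grad(x):
    # Hypothetical differentiable objective f(x) = ||x - 1||^2 / 2.
    return x - 1.0

def data_score(x):
    # Stand-in for the diffusion model's learned score grad log p_data(x);
    # here p_data = N(0, I), so the score is simply -x.
    return -x

def langevin_sample(x0, steps=2000, step_size=1e-2, temperature=0.5, seed=0):
    """Unadjusted Langevin dynamics on pi(x) ~ exp(-f(x)/T) * p_data(x)."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(steps):
        # grad log pi(x) = -grad f(x)/T + grad log p_data(x)
        grad_log_pi = -objective_grad(x) / temperature + data_score(x)
        x += step_size * grad_log_pi + np.sqrt(2 * step_size) * rng.standard_normal(x.shape)
    return x

sample = langevin_sample(np.zeros(2))
```

With this toy setup, samples concentrate between the objective's minimizer (x = 1) and the data mode (x = 0): the temperature T trades off objective value against data likelihood, which is what keeps the optimization on (a stand-in for) the data manifold.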

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-kong25b,
  title     = {Diffusion Models as Constrained Samplers for Optimization with Unknown Constraints},
  author    = {Kong, Lingkai and Du, Yuanqi and Mu, Wenhao and Neklyudov, Kirill and Bortoli, Valentin De and Wu, Dongxia and Wang, Haorui and Ferber, Aaron M and Ma, Yian and Gomes, Carla P and Zhang, Chao},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {4582--4590},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/kong25b/kong25b.pdf},
  url       = {https://proceedings.mlr.press/v258/kong25b.html},
  abstract  = {Addressing real-world optimization problems becomes particularly challenging when analytic objective functions or constraints are unavailable. While numerous studies have addressed the issue of unknown objectives, limited research has focused on scenarios where feasibility constraints are not given explicitly. Overlooking these constraints can lead to spurious solutions that are unrealistic in practice. To deal with such unknown constraints, we propose to perform optimization within the data manifold using diffusion models. To constrain the optimization process to the data manifold, we reformulate the original optimization problem as a sampling problem from the product of the Boltzmann distribution defined by the objective function and the data distribution learned by the diffusion model. Depending on the differentiability of the objective function, we propose two different sampling methods. For differentiable objectives, we propose a two-stage framework that begins with a guided diffusion process for warm-up, followed by a Langevin dynamics stage for further correction. For non-differentiable objectives, we propose an iterative importance sampling strategy using the diffusion model as the proposal distribution. Comprehensive experiments on a synthetic dataset, six real-world black-box optimization datasets, and a multi-objective molecule optimization dataset show that our method achieves better or comparable performance with previous state-of-the-art baselines.}
}
Endnote
%0 Conference Paper
%T Diffusion Models as Constrained Samplers for Optimization with Unknown Constraints
%A Lingkai Kong
%A Yuanqi Du
%A Wenhao Mu
%A Kirill Neklyudov
%A Valentin De Bortoli
%A Dongxia Wu
%A Haorui Wang
%A Aaron M Ferber
%A Yian Ma
%A Carla P Gomes
%A Chao Zhang
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-kong25b
%I PMLR
%P 4582--4590
%U https://proceedings.mlr.press/v258/kong25b.html
%V 258
%X Addressing real-world optimization problems becomes particularly challenging when analytic objective functions or constraints are unavailable. While numerous studies have addressed the issue of unknown objectives, limited research has focused on scenarios where feasibility constraints are not given explicitly. Overlooking these constraints can lead to spurious solutions that are unrealistic in practice. To deal with such unknown constraints, we propose to perform optimization within the data manifold using diffusion models. To constrain the optimization process to the data manifold, we reformulate the original optimization problem as a sampling problem from the product of the Boltzmann distribution defined by the objective function and the data distribution learned by the diffusion model. Depending on the differentiability of the objective function, we propose two different sampling methods. For differentiable objectives, we propose a two-stage framework that begins with a guided diffusion process for warm-up, followed by a Langevin dynamics stage for further correction. For non-differentiable objectives, we propose an iterative importance sampling strategy using the diffusion model as the proposal distribution. Comprehensive experiments on a synthetic dataset, six real-world black-box optimization datasets, and a multi-objective molecule optimization dataset show that our method achieves better or comparable performance with previous state-of-the-art baselines.
APA
Kong, L., Du, Y., Mu, W., Neklyudov, K., Bortoli, V.D., Wu, D., Wang, H., Ferber, A.M., Ma, Y., Gomes, C.P. & Zhang, C. (2025). Diffusion Models as Constrained Samplers for Optimization with Unknown Constraints. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:4582-4590. Available from https://proceedings.mlr.press/v258/kong25b.html.
