Conditioning diffusion models by explicit forward-backward bridging

Adrien Corenflos, Zheng Zhao, Thomas B. Schön, Simo Särkkä, Jens Sjölund
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3709-3717, 2025.

Abstract

Given an unconditional diffusion model targeting a joint distribution $\pi(x, y)$, using it to perform conditional simulation from $\pi(x \mid y)$ remains largely an open problem and is typically achieved by learning conditional drifts for the denoising SDE after the fact. In this work, we express \emph{exact} conditional simulation within the \emph{approximate} diffusion model as an inference problem on an augmented space corresponding to a partial SDE bridge. This perspective allows us to implement efficient and principled particle Gibbs and pseudo-marginal samplers that marginally target the conditional distribution $\pi(x \mid y)$. In contrast to existing methodology, our methods do not introduce any additional approximation to the unconditional diffusion model beyond the Monte Carlo error. We showcase the benefits and drawbacks of our approach on a series of synthetic and real-data examples.
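To give a concrete flavour of particle-based conditional simulation with a denoising SDE, the sketch below implements a generic guided sequential Monte Carlo filter over a reverse-time, variance-preserving discretisation. It is a minimal illustration only, not the authors' algorithm: the function and parameter names (conditional_smc_sketch, score_fn, dim_x, obs_scale) and the Gaussian guiding potential that relaxes the conditioning on $y$ are assumptions made for this example. The particle Gibbs and pseudo-marginal samplers described in the paper target $\pi(x \mid y)$ without such a relaxation, introducing no approximation beyond the Monte Carlo error.

import numpy as np


def conditional_smc_sketch(score_fn, y_obs, dim_x, n_particles=256,
                           n_steps=200, T=1.0, obs_scale=0.05, seed=0):
    """Guided SMC sketch (illustrative only) for approximately sampling
    pi(x | y = y_obs) under a learned variance-preserving denoising SDE.

    `score_fn(z, t)` is assumed to return the learned score of the noised
    joint state z = (x, y) at time t, with the same shape as `z`.
    """
    rng = np.random.default_rng(seed)
    dim_y = y_obs.shape[-1]
    dim = dim_x + dim_y
    dt = T / n_steps

    def log_guide(z, t):
        # Gaussian guiding potential on the y-block, built from the VP
        # noising kernel y_t | y_0 = y_obs ~ N(exp(-t) y_obs, 1 - exp(-2t)).
        # `obs_scale` relaxes the conditioning at t = 0 for stability.
        mean = np.exp(-t) * y_obs
        var = 1.0 - np.exp(-2.0 * t) + obs_scale ** 2
        diff = z[:, dim_x:] - mean
        return -0.5 * np.sum(diff ** 2, axis=-1) / var

    # Initialise at the reference Gaussian (time T), weighted by the guide.
    z = rng.standard_normal((n_particles, dim))
    log_w = log_guide(z, T)

    for k in range(n_steps, 0, -1):
        t = k * dt

        # Adaptive multinomial resampling when the ESS degenerates.
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        if 1.0 / np.sum(w ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, size=n_particles, p=w)
            z = z[idx]
            log_w = np.zeros(n_particles)

        # Reverse-time Euler-Maruyama step of the VP denoising SDE (beta = 2):
        # z_{t-dt} = z_t + [z_t + 2 score(z_t, t)] dt + sqrt(2 dt) eps.
        g_old = log_guide(z, t)
        drift = z + 2.0 * score_fn(z, t)
        z = z + drift * dt + np.sqrt(2.0 * dt) * rng.standard_normal(z.shape)

        # Incremental weight: ratio of guiding potentials (twisted proposal).
        log_w += log_guide(z, t - dt) - g_old

    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return z[:, :dim_x], w  # weighted particle approximation of pi(x | y_obs)

As a quick sanity check of the sketch: for a standard-normal joint target the noised marginals remain standard normal under the variance-preserving SDE, so the exact score is simply $-z$ and one can call, for instance,

samples, weights = conditional_smc_sketch(lambda z, t: -z, y_obs=np.array([0.7]), dim_x=2)

Propagating with the unconditional reverse kernel and correcting with guide ratios mirrors standard auxiliary/twisted particle filtering; the forward-backward bridging construction of the paper instead targets the conditional distribution under the learned model exactly, up to Monte Carlo error.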

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-corenflos25a,
  title     = {Conditioning diffusion models by explicit forward-backward bridging},
  author    = {Corenflos, Adrien and Zhao, Zheng and Sch{\"o}n, Thomas B. and S{\"a}rkk{\"a}, Simo and Sj{\"o}lund, Jens},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3709--3717},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/corenflos25a/corenflos25a.pdf},
  url       = {https://proceedings.mlr.press/v258/corenflos25a.html},
  abstract  = {Given an unconditional diffusion model targeting a joint distribution $\pi(x, y)$, using it to perform conditional simulation from $\pi(x \mid y)$ remains largely an open problem and is typically achieved by learning conditional drifts for the denoising SDE after the fact. In this work, we express \emph{exact} conditional simulation within the \emph{approximate} diffusion model as an inference problem on an augmented space corresponding to a partial SDE bridge. This perspective allows us to implement efficient and principled particle Gibbs and pseudo-marginal samplers that marginally target the conditional distribution $\pi(x \mid y)$. In contrast to existing methodology, our methods do not introduce any additional approximation to the unconditional diffusion model beyond the Monte Carlo error. We showcase the benefits and drawbacks of our approach on a series of synthetic and real-data examples.}
}
Endnote
%0 Conference Paper
%T Conditioning diffusion models by explicit forward-backward bridging
%A Adrien Corenflos
%A Zheng Zhao
%A Thomas B. Schön
%A Simo Särkkä
%A Jens Sjölund
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-corenflos25a
%I PMLR
%P 3709--3717
%U https://proceedings.mlr.press/v258/corenflos25a.html
%V 258
%X Given an unconditional diffusion model targeting a joint distribution $\pi(x, y)$, using it to perform conditional simulation from $\pi(x \mid y)$ remains largely an open problem and is typically achieved by learning conditional drifts for the denoising SDE after the fact. In this work, we express \emph{exact} conditional simulation within the \emph{approximate} diffusion model as an inference problem on an augmented space corresponding to a partial SDE bridge. This perspective allows us to implement efficient and principled particle Gibbs and pseudo-marginal samplers that marginally target the conditional distribution $\pi(x \mid y)$. In contrast to existing methodology, our methods do not introduce any additional approximation to the unconditional diffusion model beyond the Monte Carlo error. We showcase the benefits and drawbacks of our approach on a series of synthetic and real-data examples.
APA
Corenflos, A., Zhao, Z., Schön, T.B., Särkkä, S. & Sjölund, J. (2025). Conditioning diffusion models by explicit forward-backward bridging. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3709-3717. Available from https://proceedings.mlr.press/v258/corenflos25a.html.
