Distributional Diffusion Models with Scoring Rules

Valentin De Bortoli, Alexandre Galashov, J Swaroop Guntupalli, Guangyao Zhou, Kevin Patrick Murphy, Arthur Gretton, Arnaud Doucet
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:12632-12676, 2025.

Abstract

Diffusion models generate high-quality synthetic data. They operate by defining a continuous-time forward process which gradually adds Gaussian noise to data until fully corrupted. The corresponding reverse process progressively “denoises” a Gaussian sample into a sample from the data distribution. However, generating high-quality outputs requires many discretization steps to obtain a faithful approximation of the reverse process. This is expensive and has motivated the development of many acceleration methods. We propose to speed up sample generation by learning the posterior distribution of clean data samples given their noisy versions, instead of only the mean of this distribution. This allows us to sample from the probability transitions of the reverse process on a coarse time scale, significantly accelerating inference with minimal degradation of the quality of the output. This is accomplished by replacing the standard regression loss used to estimate conditional means with a scoring rule. We validate our method on image and robot trajectory generation, where we consistently outperform standard diffusion models at few discretization steps.
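For intuition, here is a minimal PyTorch sketch of the kind of scoring-rule objective the abstract describes, using the energy score as a concrete instance of a strictly proper scoring rule. The tensor shapes, the `beta` exponent, and the 1/2 weighting of the repulsion term are illustrative assumptions, not the paper's exact objective; `denoiser` below is a hypothetical network that takes the noisy input, the time, and a fresh latent so that repeated calls yield distinct posterior samples.

```python
import torch

def energy_score_loss(samples: torch.Tensor, x0: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    """Monte Carlo estimate of a (negative) energy score, a strictly proper scoring rule.

    samples: (m, B, D) -- m >= 2 posterior samples per noisy input x_t, obtained by
             calling the denoiser with m independent latent noise vectors.
    x0:      (B, D)    -- clean data paired with each noisy input.
    """
    m = samples.shape[0]
    # Data-attachment term: E ||X - x0||^beta with X ~ p_theta(x0 | x_t).
    attract = (samples - x0.unsqueeze(0)).norm(dim=-1).pow(beta).mean()
    # Diversity term: E ||X - X'||^beta over pairs of independent posterior samples.
    pair = (samples.unsqueeze(0) - samples.unsqueeze(1)).norm(dim=-1).pow(beta)  # (m, m, B)
    repel = pair.sum(dim=(0, 1)) / (m * (m - 1))  # diagonal i == j contributes zero
    # Dropping the diversity term reduces this to a plain regression loss, which only
    # recovers the posterior mean; keeping it targets the full posterior distribution.
    return attract - 0.5 * repel.mean()
```

At inference time, a denoiser that samples the posterior (rather than returning its mean) permits coarse ancestral steps: draw one clean-data sample, then re-noise it to the next time on the grid via the forward kernel. A sketch under the same assumptions, with `alpha` and `sigma` standing in for an unspecified noise schedule:

```python
@torch.no_grad()
def coarse_sample(denoiser, x, times, alpha, sigma):
    """Ancestral sampling on a coarse, decreasing time grid `times` (from T toward 0).

    denoiser(x_t, t, z) is assumed to return one sample from p_theta(x0 | x_t);
    alpha(s), sigma(s) are the forward-process schedule coefficients at time s.
    """
    for t, s in zip(times[:-1], times[1:]):
        z = torch.randn_like(x)           # fresh latent -> a posterior sample, not the mean
        x0_hat = denoiser(x, t, z)        # one draw from the learned posterior
        x = alpha(s) * x0_hat + sigma(s) * torch.randn_like(x)  # re-noise to time s
    return x0_hat                         # final clean-data sample
```

Because each step samples a full transition of the reverse process rather than following a mean prediction, fewer and larger steps can be taken before sample quality degrades, which is the acceleration the abstract claims.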

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-de-bortoli25b,
  title     = {Distributional Diffusion Models with Scoring Rules},
  author    = {De Bortoli, Valentin and Galashov, Alexandre and Guntupalli, J Swaroop and Zhou, Guangyao and Murphy, Kevin Patrick and Gretton, Arthur and Doucet, Arnaud},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {12632--12676},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/de-bortoli25b/de-bortoli25b.pdf},
  url       = {https://proceedings.mlr.press/v267/de-bortoli25b.html}
}
Endnote
%0 Conference Paper
%T Distributional Diffusion Models with Scoring Rules
%A Valentin De Bortoli
%A Alexandre Galashov
%A J Swaroop Guntupalli
%A Guangyao Zhou
%A Kevin Patrick Murphy
%A Arthur Gretton
%A Arnaud Doucet
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-de-bortoli25b
%I PMLR
%P 12632--12676
%U https://proceedings.mlr.press/v267/de-bortoli25b.html
%V 267
APA
De Bortoli, V., Galashov, A., Guntupalli, J.S., Zhou, G., Murphy, K.P., Gretton, A. & Doucet, A. (2025). Distributional Diffusion Models with Scoring Rules. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:12632-12676. Available from https://proceedings.mlr.press/v267/de-bortoli25b.html.