Unifying Bayesian Flow Networks and Diffusion Models through Stochastic Differential Equations

Kaiwen Xue, Yuhao Zhou, Shen Nie, Xu Min, Xiaolu Zhang, Jun Zhou, Chongxuan Li
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:55656-55681, 2024.

Abstract

Bayesian flow networks (BFNs) iteratively refine the parameters of distributions at various noise levels through Bayesian inference, rather than the samples as in diffusion models (DMs). Owing to their differentiable nature, BFNs are promising for modeling both continuous and discrete data while maintaining fast sampling capabilities. This paper aims to understand and enhance BFNs by connecting them with DMs through stochastic differential equations (SDEs). We identify the linear SDEs corresponding to the noise-addition processes in BFNs, demonstrate that BFN’s regression losses are aligned with denoising score matching, and validate the BFN sampler as a first-order solver for the corresponding reverse-time SDE. Based on these findings and existing recipes for fast sampling in DMs, we propose specialized solvers for BFNs that markedly surpass the original BFN sampler in sample quality under a limited number of function evaluations (e.g., 10) on both image and text datasets. Notably, our best sampler achieves a speed-up of $5\sim20$ times for free.
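For orientation, the following is a generic sketch of the SDE machinery the abstract refers to, written in the standard linear-SDE notation used for score-based diffusion models; the specific drift and diffusion coefficients that realize the BFN noise-addition processes are derived in the paper and are not reproduced here. A linear forward SDE,

$$\mathrm{d}\mathbf{x}_t = f(t)\,\mathbf{x}_t\,\mathrm{d}t + g(t)\,\mathrm{d}\mathbf{w}_t,$$

admits a reverse-time SDE driven by the score of the marginal density $p_t$,

$$\mathrm{d}\mathbf{x}_t = \big[f(t)\,\mathbf{x}_t - g(t)^2\,\nabla_{\mathbf{x}}\log p_t(\mathbf{x}_t)\big]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{\mathbf{w}}_t,$$

where the score is learned with a denoising score matching objective of the form

$$\mathbb{E}_{t,\,\mathbf{x}_0,\,\mathbf{x}_t\mid\mathbf{x}_0}\Big[\lambda(t)\,\big\|\,s_\theta(\mathbf{x}_t,t) - \nabla_{\mathbf{x}_t}\log p_t(\mathbf{x}_t\mid\mathbf{x}_0)\,\big\|^2\Big].$$

A generic first-order (Euler–Maruyama) step of the reverse-time SDE from $t$ to $t-\Delta t$, i.e., the kind of solver the abstract identifies the original BFN sampler with, reads

$$\mathbf{x}_{t-\Delta t} = \mathbf{x}_t - \big[f(t)\,\mathbf{x}_t - g(t)^2\,s_\theta(\mathbf{x}_t,t)\big]\,\Delta t + g(t)\sqrt{\Delta t}\,\mathbf{z},\qquad \mathbf{z}\sim\mathcal{N}(\mathbf{0},\mathbf{I}).$$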

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-xue24d,
  title     = {Unifying {B}ayesian Flow Networks and Diffusion Models through Stochastic Differential Equations},
  author    = {Xue, Kaiwen and Zhou, Yuhao and Nie, Shen and Min, Xu and Zhang, Xiaolu and Zhou, Jun and Li, Chongxuan},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {55656--55681},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/xue24d/xue24d.pdf},
  url       = {https://proceedings.mlr.press/v235/xue24d.html}
}
Endnote
%0 Conference Paper
%T Unifying Bayesian Flow Networks and Diffusion Models through Stochastic Differential Equations
%A Kaiwen Xue
%A Yuhao Zhou
%A Shen Nie
%A Xu Min
%A Xiaolu Zhang
%A Jun Zhou
%A Chongxuan Li
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-xue24d
%I PMLR
%P 55656--55681
%U https://proceedings.mlr.press/v235/xue24d.html
%V 235
APA
Xue, K., Zhou, Y., Nie, S., Min, X., Zhang, X., Zhou, J. & Li, C. (2024). Unifying Bayesian Flow Networks and Diffusion Models through Stochastic Differential Equations. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:55656-55681. Available from https://proceedings.mlr.press/v235/xue24d.html.
