Convergence Analysis for General Probability Flow ODEs of Diffusion Models in Wasserstein Distances

Xuefeng Gao, Lingjiong Zhu
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1009-1017, 2025.

Abstract

Score-based generative modeling with probability flow ordinary differential equations (ODEs) has achieved remarkable success in a variety of applications. While various fast ODE-based samplers have been proposed in the literature and employed in practice, the theoretical understanding of the convergence properties of the probability flow ODE is still quite limited. In this paper, we provide the first non-asymptotic convergence analysis for a general class of probability flow ODE samplers in 2-Wasserstein distance, assuming accurate score estimates and smooth log-concave data distributions. We then consider various examples and establish results on the iteration complexity of the corresponding ODE-based samplers. Our proof technique relies on spelling out explicitly the contraction rate for the continuous-time ODE and analyzing the discretization and score-matching errors by using synchronous coupling; the challenge in our analysis mainly arises from the inherent non-autonomy of the probability flow ODE and the specific exponential integrator that we study.
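For readers unfamiliar with the objects named in the abstract, the following is a standard background sketch (not a statement from the paper itself). For a forward diffusion with linear drift, $dX_t = f(t)\,X_t\,dt + g(t)\,dW_t$, the probability flow ODE is the deterministic dynamics sharing the same marginal laws $p_t$; samplers integrate it in reverse time with the true score $\nabla_x \log p_t$ replaced by an estimate $s_\theta(x,t)$. A common form of exponential integrator (the paper analyzes a specific one) solves the linear part of the drift exactly and freezes the score term over each step:

```latex
% Probability flow ODE associated with the forward SDE
% dX_t = f(t) X_t dt + g(t) dW_t  (standard form):
\frac{d}{dt}\,x(t) \;=\; f(t)\,x(t) \;-\; \tfrac{1}{2}\,g(t)^2\,\nabla_x \log p_t\big(x(t)\big).

% Generic exponential-integrator step for an ODE of the form
% dx/dt = a(t) x + b(t) s_theta(x, t): the linear term a(t) x is
% integrated exactly, while s_theta is frozen at the step's start.
x_{k+1} \;=\; e^{\int_{t_k}^{t_{k+1}} a(s)\,ds}\, x_k
\;+\; \left( \int_{t_k}^{t_{k+1}} e^{\int_{s}^{t_{k+1}} a(r)\,dr}\, b(s)\, ds \right) s_\theta(x_k, t_k).
```

The non-autonomy mentioned in the abstract refers to the explicit time dependence of $f(t)$, $g(t)$, and $p_t$ above, which rules out the time-homogeneous contraction arguments available for autonomous flows.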

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-gao25c,
  title = {Convergence Analysis for General Probability Flow ODEs of Diffusion Models in Wasserstein Distances},
  author = {Gao, Xuefeng and Zhu, Lingjiong},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages = {1009--1017},
  year = {2025},
  editor = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume = {258},
  series = {Proceedings of Machine Learning Research},
  month = {03--05 May},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/gao25c/gao25c.pdf},
  url = {https://proceedings.mlr.press/v258/gao25c.html},
  abstract = {Score-based generative modeling with probability flow ordinary differential equations (ODEs) has achieved remarkable success in a variety of applications. While various fast ODE-based samplers have been proposed in the literature and employed in practice, the theoretical understandings about convergence properties of the probability flow ODE are still quite limited. In this paper, we provide the first non-asymptotic convergence analysis for a general class of probability flow ODE samplers in 2-Wasserstein distance, assuming accurate score estimates and smooth log-concave data distributions. We then consider various examples and establish results on the iteration complexity of the corresponding ODE-based samplers. Our proof technique relies on spelling out explicitly the contraction rate for the continuous-time ODE and analyzing the discretization and score-matching errors by using synchronous coupling; the challenge in our analysis mainly arises from the inherent non-autonomy of the probability flow ODE and the specific exponential integrator that we study.}
}
Endnote
%0 Conference Paper
%T Convergence Analysis for General Probability Flow ODEs of Diffusion Models in Wasserstein Distances
%A Xuefeng Gao
%A Lingjiong Zhu
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-gao25c
%I PMLR
%P 1009--1017
%U https://proceedings.mlr.press/v258/gao25c.html
%V 258
%X Score-based generative modeling with probability flow ordinary differential equations (ODEs) has achieved remarkable success in a variety of applications. While various fast ODE-based samplers have been proposed in the literature and employed in practice, the theoretical understandings about convergence properties of the probability flow ODE are still quite limited. In this paper, we provide the first non-asymptotic convergence analysis for a general class of probability flow ODE samplers in 2-Wasserstein distance, assuming accurate score estimates and smooth log-concave data distributions. We then consider various examples and establish results on the iteration complexity of the corresponding ODE-based samplers. Our proof technique relies on spelling out explicitly the contraction rate for the continuous-time ODE and analyzing the discretization and score-matching errors by using synchronous coupling; the challenge in our analysis mainly arises from the inherent non-autonomy of the probability flow ODE and the specific exponential integrator that we study.
APA
Gao, X. & Zhu, L. (2025). Convergence Analysis for General Probability Flow ODEs of Diffusion Models in Wasserstein Distances. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1009-1017. Available from https://proceedings.mlr.press/v258/gao25c.html.