Convergence of Consistency Model with Multistep Sampling under General Data Assumptions

Yiding Chen, Yiyi Zhang, Owen Oertell, Wen Sun
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:9764-9782, 2025.

Abstract

Diffusion models achieve remarkable success in data generation tasks across various domains. However, their iterative sampling process is computationally expensive. Consistency models have been proposed to learn consistency functions that map noise directly to data, enabling fast one-step generation as well as multistep sampling to improve sample quality. In this paper, we study the convergence of consistency models when the self-consistency property holds approximately under the training distribution. Our analysis requires only mild data assumptions and applies to a family of forward processes. When the target data distribution has bounded support or has tails that decay sufficiently fast, we show that the samples generated by the consistency model are close to the target distribution in Wasserstein distance; when the target distribution satisfies a smoothness assumption, we show that with an additional perturbation step for smoothing, the generated samples are close to the target distribution in total variation distance. We provide two case studies with commonly chosen forward processes to demonstrate the benefit of multistep sampling.
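
For context, the multistep sampling referenced above alternates between denoising with the learned consistency function and re-noising to a smaller time level. The following is a minimal sketch of that standard procedure (in the style of Song et al., 2023), assuming the common variance-exploding forward process x_t = x_0 + t z; the function f, the time schedule taus, and the constants T and eps are illustrative placeholders rather than values taken from this paper.

import numpy as np

def multistep_consistency_sampling(f, dim, T=80.0, eps=0.002,
                                   taus=(40.0, 10.0, 2.0), rng=None):
    """Sketch of multistep consistency sampling.

    Assumes the variance-exploding forward process x_t = x_0 + t * z;
    f(x, t) is a trained consistency model mapping a noisy point at
    time t to an estimate of the clean data. The schedule `taus` is
    illustrative and should be decreasing with values in (eps, T).
    """
    rng = np.random.default_rng() if rng is None else rng

    # One-step generation: start from pure noise at the largest time T.
    x_T = T * rng.standard_normal(dim)
    x = f(x_T, T)

    # Multistep refinement: re-noise to a smaller time, then denoise again.
    for tau in taus:
        z = rng.standard_normal(dim)
        x_tau = x + np.sqrt(tau**2 - eps**2) * z
        x = f(x_tau, tau)
    return x

Each additional step trades extra model evaluations for improved sample quality; the paper's analysis quantifies this benefit under the stated data assumptions.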

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-chen25ck,
  title     = {Convergence of Consistency Model with Multistep Sampling under General Data Assumptions},
  author    = {Chen, Yiding and Zhang, Yiyi and Oertell, Owen and Sun, Wen},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {9764--9782},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/chen25ck/chen25ck.pdf},
  url       = {https://proceedings.mlr.press/v267/chen25ck.html},
  abstract  = {Diffusion models accomplish remarkable success in data generation tasks across various domains. However, the iterative sampling process is computationally expensive. Consistency models are proposed to learn consistency functions to map from noise to data directly, which allows one-step fast data generation and multistep sampling to improve sample quality. In this paper, we study the convergence of consistency models when the self-consistency property holds approximately under the training distribution. Our analysis requires only mild data assumption and applies to a family of forward processes. When the target data distribution has bounded support or has tails that decay sufficiently fast, we show that the samples generated by the consistency model are close to the target distribution in Wasserstein distance; when the target distribution satisfies some smoothness assumption, we show that with an additional perturbation step for smoothing, the generated samples are close to the target distribution in total variation distance. We provide two case studies with commonly chosen forward processes to demonstrate the benefit of multistep sampling.}
}
Endnote
%0 Conference Paper
%T Convergence of Consistency Model with Multistep Sampling under General Data Assumptions
%A Yiding Chen
%A Yiyi Zhang
%A Owen Oertell
%A Wen Sun
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-chen25ck
%I PMLR
%P 9764--9782
%U https://proceedings.mlr.press/v267/chen25ck.html
%V 267
%X Diffusion models accomplish remarkable success in data generation tasks across various domains. However, the iterative sampling process is computationally expensive. Consistency models are proposed to learn consistency functions to map from noise to data directly, which allows one-step fast data generation and multistep sampling to improve sample quality. In this paper, we study the convergence of consistency models when the self-consistency property holds approximately under the training distribution. Our analysis requires only mild data assumption and applies to a family of forward processes. When the target data distribution has bounded support or has tails that decay sufficiently fast, we show that the samples generated by the consistency model are close to the target distribution in Wasserstein distance; when the target distribution satisfies some smoothness assumption, we show that with an additional perturbation step for smoothing, the generated samples are close to the target distribution in total variation distance. We provide two case studies with commonly chosen forward processes to demonstrate the benefit of multistep sampling.
APA
Chen, Y., Zhang, Y., Oertell, O., & Sun, W. (2025). Convergence of Consistency Model with Multistep Sampling under General Data Assumptions. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:9764-9782. Available from https://proceedings.mlr.press/v267/chen25ck.html.