Diffusion Models are Minimax Optimal Distribution Estimators

Kazusato Oko, Shunta Akiyama, Taiji Suzuki
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:26517-26582, 2023.

Abstract

While efficient distribution learning is doubtless behind the groundbreaking success of diffusion modeling, its theoretical guarantees remain quite limited. In this paper, we provide the first rigorous analysis of the approximation and generalization abilities of diffusion modeling for well-known function spaces. The highlight of this paper is that when the true density function belongs to a Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves nearly minimax optimal estimation rates in the total variation distance and in the Wasserstein distance of order one. Furthermore, we extend our theory to demonstrate how diffusion models adapt to low-dimensional data distributions. We expect these results to advance the theoretical understanding of diffusion modeling and its ability to generate verisimilar outputs.
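
As a pointer for readers, the following LaTeX sketch records the objects the abstract refers to. The notation ($x_0$, $\hat{s}$, $p_t$, $B^s_{p,q}$, the time window $[\underline{t}, T]$) is our own shorthand rather than the paper's, and the displays summarize the standard formulation of denoising score matching and the classical nonparametric benchmark, not the paper's exact theorem statements.

% A hedged sketch; notation is assumed, not quoted from the paper.
% Forward (Ornstein--Uhlenbeck) noising of a sample $x_0 \sim p_0$:
\[
  x_t = e^{-t} x_0 + \sqrt{1 - e^{-2t}}\, z, \qquad z \sim \mathcal{N}(0, I_d).
\]
% Empirical denoising score matching over a time window $[\underline{t}, T]$:
\[
  \widehat{\mathcal{L}}(\hat{s})
  = \frac{1}{n} \sum_{i=1}^{n} \int_{\underline{t}}^{T}
    \mathbb{E}_{x_t \mid x_0 = x_i}
    \left\| \hat{s}(x_t, t) - \nabla_{x_t} \log p_t(x_t \mid x_0 = x_i) \right\|^2 dt.
\]
% If $p_0$ lies in a Besov ball $B^s_{p,q}$ on a compact domain, the classical
% minimax rate for density estimation in total variation is, up to log factors
% and regularity conditions,
\[
  \inf_{\hat{p}_n} \sup_{p_0 \in B^s_{p,q}} \mathbb{E}\, \mathrm{TV}(\hat{p}_n, p_0)
  \asymp n^{-s/(2s + d)}.
\]

In words: the network $\hat{s}$ is fit to the conditional scores of noised samples, and the paper's headline result is that sampling the learned reverse process attains the $n^{-s/(2s+d)}$ benchmark up to logarithmic factors, with an analogous guarantee in the Wasserstein distance of order one.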

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-oko23a,
  title     = {Diffusion Models are Minimax Optimal Distribution Estimators},
  author    = {Oko, Kazusato and Akiyama, Shunta and Suzuki, Taiji},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {26517--26582},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/oko23a/oko23a.pdf},
  url       = {https://proceedings.mlr.press/v202/oko23a.html}
}
Endnote
%0 Conference Paper
%T Diffusion Models are Minimax Optimal Distribution Estimators
%A Kazusato Oko
%A Shunta Akiyama
%A Taiji Suzuki
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-oko23a
%I PMLR
%P 26517--26582
%U https://proceedings.mlr.press/v202/oko23a.html
%V 202
APA
Oko, K., Akiyama, S. & Suzuki, T. (2023). Diffusion Models are Minimax Optimal Distribution Estimators. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:26517-26582. Available from https://proceedings.mlr.press/v202/oko23a.html.