Generalization error bound for denoising score matching under relaxed manifold assumption

Konstantin Yakovlev, Nikita Puchkin
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:5824-5891, 2025.

Abstract

We examine theoretical properties of the denoising score matching estimate. We model the density of observations with a nonparametric Gaussian mixture. We significantly relax the standard manifold assumption, allowing the samples to step away from the manifold. At the same time, we are still able to leverage the nice structure of the distribution. We derive non-asymptotic bounds on the approximation and generalization errors of the denoising score matching estimate. The rates of convergence are determined by the intrinsic dimension. Furthermore, our bounds remain valid even if we allow the ambient dimension to grow polynomially with the sample size.
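
For context, here is a minimal sketch of the denoising score matching objective in its standard form (following Vincent, 2011); the notation is illustrative and is not taken from the paper itself. Given observations $X \sim p$ on $\mathbb{R}^D$ and a noise level $\sigma > 0$, the estimator minimizes the empirical counterpart of

\[
\mathcal{L}_{\mathrm{DSM}}(s) \;=\; \mathbb{E}_{X \sim p}\, \mathbb{E}_{\varepsilon \sim \mathcal{N}(0, I_D)} \Bigl\| s(X + \sigma \varepsilon) + \frac{\varepsilon}{\sigma} \Bigr\|^2,
\]

whose population minimizer is the score $\nabla \log p_\sigma$ of the smoothed density $p_\sigma = p * \mathcal{N}(0, \sigma^2 I_D)$, since $\nabla_{\tilde{x}} \log q_\sigma(\tilde{x} \mid x) = -(\tilde{x} - x)/\sigma^2$ for the Gaussian transition kernel $q_\sigma(\cdot \mid x) = \mathcal{N}(x, \sigma^2 I_D)$. Note that $p_\sigma$ is itself a (possibly continuous) Gaussian mixture, which is presumably the distribution structure the nonparametric Gaussian mixture model in the abstract is meant to capture.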

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-yakovlev25a,
  title     = {Generalization error bound for denoising score matching under relaxed manifold assumption},
  author    = {Yakovlev, Konstantin and Puchkin, Nikita},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {5824--5891},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/yakovlev25a/yakovlev25a.pdf},
  url       = {https://proceedings.mlr.press/v291/yakovlev25a.html},
  abstract  = {We examine theoretical properties of the denoising score matching estimate. We model the density of observations with a nonparametric Gaussian mixture. We significantly relax the standard manifold assumption, allowing the samples to step away from the manifold. At the same time, we are still able to leverage the nice structure of the distribution. We derive non-asymptotic bounds on the approximation and generalization errors of the denoising score matching estimate. The rates of convergence are determined by the intrinsic dimension. Furthermore, our bounds remain valid even if we allow the ambient dimension to grow polynomially with the sample size.}
}
EndNote
%0 Conference Paper
%T Generalization error bound for denoising score matching under relaxed manifold assumption
%A Konstantin Yakovlev
%A Nikita Puchkin
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-yakovlev25a
%I PMLR
%P 5824--5891
%U https://proceedings.mlr.press/v291/yakovlev25a.html
%V 291
%X We examine theoretical properties of the denoising score matching estimate. We model the density of observations with a nonparametric Gaussian mixture. We significantly relax the standard manifold assumption, allowing the samples to step away from the manifold. At the same time, we are still able to leverage the nice structure of the distribution. We derive non-asymptotic bounds on the approximation and generalization errors of the denoising score matching estimate. The rates of convergence are determined by the intrinsic dimension. Furthermore, our bounds remain valid even if we allow the ambient dimension to grow polynomially with the sample size.
APA
Yakovlev, K. & Puchkin, N. (2025). Generalization error bound for denoising score matching under relaxed manifold assumption. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:5824-5891. Available from https://proceedings.mlr.press/v291/yakovlev25a.html.
