What’s the score? Automated Denoising Score Matching for Nonlinear Diffusions

Raghav Singhal, Mark Goldstein, Rajesh Ranganath
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:45734-45758, 2024.

Abstract

Reversing a diffusion process by learning its score forms the heart of diffusion-based generative modeling and of estimating properties of scientific systems. The diffusion processes that are tractable center on linear processes with a Gaussian stationary distribution, limiting the models that can be built to those that target a Gaussian prior and, more generally, limiting the problems that can be generically solved to those with conditionally linear score functions. In this work, we introduce a family of tractable denoising score matching objectives, called local-DSM, built using local increments of the diffusion process. We show how local-DSM, melded with Taylor expansions, enables automated training and score estimation with nonlinear diffusion processes. To demonstrate these ideas, we use automated-DSM to train generative models with non-Gaussian priors on challenging low-dimensional distributions and the CIFAR10 image dataset. Additionally, we use automated-DSM to learn the scores of nonlinear processes studied in statistical physics.
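For orientation, the classical denoising score matching (DSM) objective that the abstract contrasts with applies to linear diffusions, where x_t | x_0 is Gaussian with mean alpha(t)·x_0 and standard deviation sigma(t), so the conditional score is available in closed form. The sketch below shows only that standard linear/Gaussian case, not the paper's local-DSM; all function names and schedules are illustrative assumptions, not the authors' code.

```python
import numpy as np

def dsm_loss(score_fn, x0, t, alpha, sigma, rng):
    """Monte-Carlo estimate of the classical DSM loss for a linear diffusion.

    Since x_t = alpha(t) * x0 + sigma(t) * eps with eps ~ N(0, I), the
    conditional score grad_x log p(x_t | x_0) equals -eps / sigma(t),
    which serves as the regression target for the score model.
    """
    eps = rng.standard_normal(x0.shape)
    xt = alpha(t) * x0 + sigma(t) * eps
    target = -eps / sigma(t)
    return float(np.mean((score_fn(xt, t) - target) ** 2))

# Toy check with a VP-style schedule (alpha^2 + sigma^2 = 1): if
# x0 ~ N(0, 1), the marginal of x_t stays standard normal, so the
# exact score is s(x, t) = -x.
rng = np.random.default_rng(0)
x0 = rng.standard_normal((1024, 1))
alpha = lambda t: np.cos(t)
sigma = lambda t: np.sin(t)
exact_score = lambda x, t: -x
loss = dsm_loss(exact_score, x0, 0.7, alpha, sigma, rng)
```

The tractability of the `target` line is exactly what breaks for nonlinear diffusions, where the transition density has no closed form; the paper's local-DSM replaces it with local increments plus Taylor expansions.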

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-singhal24a,
  title     = {What’s the score? {A}utomated Denoising Score Matching for Nonlinear Diffusions},
  author    = {Singhal, Raghav and Goldstein, Mark and Ranganath, Rajesh},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {45734--45758},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/singhal24a/singhal24a.pdf},
  url       = {https://proceedings.mlr.press/v235/singhal24a.html},
  abstract  = {Reversing a diffusion process by learning its score forms the heart of diffusion-based generative modeling and for estimating properties of scientific systems. The diffusion processes that are tractable center on linear processes with a Gaussian stationary distribution, limiting the kinds of models that can be built to those that target a Gaussian prior or more generally limits the kinds of problems that can be generically solved to those that have conditionally linear score functions. In this work, we introduce a family of tractable denoising score matching objectives, called local-DSM, built using local increments of the diffusion process. We show how local-DSM melded with Taylor expansions enables automated training and score estimation with nonlinear diffusion processes. To demonstrate these ideas, we use automated-DSM to train generative models using non-Gaussian priors on challenging low dimensional distributions and the CIFAR10 image dataset. Additionally, we use the automated-DSM to learn the scores for nonlinear processes studied in statistical physics.}
}
Endnote
%0 Conference Paper
%T What’s the score? Automated Denoising Score Matching for Nonlinear Diffusions
%A Raghav Singhal
%A Mark Goldstein
%A Rajesh Ranganath
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-singhal24a
%I PMLR
%P 45734--45758
%U https://proceedings.mlr.press/v235/singhal24a.html
%V 235
%X Reversing a diffusion process by learning its score forms the heart of diffusion-based generative modeling and for estimating properties of scientific systems. The diffusion processes that are tractable center on linear processes with a Gaussian stationary distribution, limiting the kinds of models that can be built to those that target a Gaussian prior or more generally limits the kinds of problems that can be generically solved to those that have conditionally linear score functions. In this work, we introduce a family of tractable denoising score matching objectives, called local-DSM, built using local increments of the diffusion process. We show how local-DSM melded with Taylor expansions enables automated training and score estimation with nonlinear diffusion processes. To demonstrate these ideas, we use automated-DSM to train generative models using non-Gaussian priors on challenging low dimensional distributions and the CIFAR10 image dataset. Additionally, we use the automated-DSM to learn the scores for nonlinear processes studied in statistical physics.
APA
Singhal, R., Goldstein, M. & Ranganath, R. (2024). What’s the score? Automated Denoising Score Matching for Nonlinear Diffusions. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:45734-45758. Available from https://proceedings.mlr.press/v235/singhal24a.html.