Nearest Neighbour Score Estimators for Diffusion Generative Models

Matthew Niedoba, Dylan Green, Saeid Naderiparizi, Vasileios Lioutas, Jonathan Wilder Lavington, Xiaoxuan Liang, Yunpeng Liu, Ke Zhang, Setareh Dabiri, Adam Scibior, Berend Zwartsenberg, Frank Wood
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:38117-38144, 2024.

Abstract

Score function estimation is the cornerstone of both training and sampling from diffusion generative models. Despite this fact, the most commonly used estimators are either biased neural network approximations or high variance Monte Carlo estimators based on the conditional score. We introduce a novel nearest neighbour score function estimator which utilizes multiple samples from the training set to dramatically decrease estimator variance. We leverage our low variance estimator in two compelling applications. Training consistency models with our estimator, we report a significant increase in both convergence speed and sample quality. In diffusion models, we show that our estimator can replace a learned network for probability-flow ODE integration, opening promising new avenues of future research. Code will be released upon paper acceptance.
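
To make the idea concrete, here is a minimal sketch of a nearest-neighbour score estimate, assuming a variance-preserving forward process x_t = alpha_t * x_0 + sigma_t * eps. Under the empirical data distribution, p_t(x_t) is a Gaussian mixture with one component per training point, and its exact score is a softmax-weighted average of the conditional scores (alpha_t * x_i - x_t) / sigma_t^2. The sketch below approximates that average by keeping only the K training points nearest to x_t; the function name and parameters are hypothetical, and this is the spirit of the paper's estimator rather than its exact algorithm, which also accounts for the training points outside the neighbour set.

    import numpy as np

    def nn_score_estimate(x_t, alpha_t, sigma_t, train_data, k=64):
        """Approximate the marginal score grad_x log p_t(x_t) using only
        the k training points whose mixture means are nearest to x_t.
        Hypothetical sketch (assumes k < len(train_data)); not the
        paper's exact estimator."""
        # Means of the mixture components p(x_t | x_0 = x_i): (N, D).
        means = alpha_t * train_data
        # Squared distances from x_t to each component mean.
        d2 = np.sum((means - x_t) ** 2, axis=1)
        # Indices of the k nearest components.
        idx = np.argpartition(d2, k)[:k]
        # Softmax responsibilities over the selected components.
        # All components share covariance sigma_t^2 * I and weight 1/N,
        # so log-density differs from -d2 / (2 sigma_t^2) only by a
        # constant, which the softmax cancels.
        logits = -d2[idx] / (2.0 * sigma_t ** 2)
        w = np.exp(logits - logits.max())
        w /= w.sum()
        # Posterior-weighted average of the conditional scores.
        cond_scores = (means[idx] - x_t) / sigma_t ** 2  # (k, D)
        return w @ cond_scores                           # (D,)

Averaging over many plausible training points is what drives the variance down relative to the single-sample conditional-score Monte Carlo estimator; truncating to nearest neighbours keeps the cost manageable because the softmax weights of distant points are negligible at moderate noise levels.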

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-niedoba24a,
  title     = {Nearest Neighbour Score Estimators for Diffusion Generative Models},
  author    = {Niedoba, Matthew and Green, Dylan and Naderiparizi, Saeid and Lioutas, Vasileios and Lavington, Jonathan Wilder and Liang, Xiaoxuan and Liu, Yunpeng and Zhang, Ke and Dabiri, Setareh and Scibior, Adam and Zwartsenberg, Berend and Wood, Frank},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {38117--38144},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/niedoba24a/niedoba24a.pdf},
  url       = {https://proceedings.mlr.press/v235/niedoba24a.html},
  abstract  = {Score function estimation is the cornerstone of both training and sampling from diffusion generative models. Despite this fact, the most commonly used estimators are either biased neural network approximations or high variance Monte Carlo estimators based on the conditional score. We introduce a novel nearest neighbour score function estimator which utilizes multiple samples from the training set to dramatically decrease estimator variance. We leverage our low variance estimator in two compelling applications. Training consistency models with our estimator, we report a significant increase in both convergence speed and sample quality. In diffusion models, we show that our estimator can replace a learned network for probability-flow ODE integration, opening promising new avenues of future research. Code will be released upon paper acceptance.}
}
EndNote
%0 Conference Paper
%T Nearest Neighbour Score Estimators for Diffusion Generative Models
%A Matthew Niedoba
%A Dylan Green
%A Saeid Naderiparizi
%A Vasileios Lioutas
%A Jonathan Wilder Lavington
%A Xiaoxuan Liang
%A Yunpeng Liu
%A Ke Zhang
%A Setareh Dabiri
%A Adam Scibior
%A Berend Zwartsenberg
%A Frank Wood
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-niedoba24a
%I PMLR
%P 38117--38144
%U https://proceedings.mlr.press/v235/niedoba24a.html
%V 235
%X Score function estimation is the cornerstone of both training and sampling from diffusion generative models. Despite this fact, the most commonly used estimators are either biased neural network approximations or high variance Monte Carlo estimators based on the conditional score. We introduce a novel nearest neighbour score function estimator which utilizes multiple samples from the training set to dramatically decrease estimator variance. We leverage our low variance estimator in two compelling applications. Training consistency models with our estimator, we report a significant increase in both convergence speed and sample quality. In diffusion models, we show that our estimator can replace a learned network for probability-flow ODE integration, opening promising new avenues of future research. Code will be released upon paper acceptance.
APA
Niedoba, M., Green, D., Naderiparizi, S., Lioutas, V., Lavington, J.W., Liang, X., Liu, Y., Zhang, K., Dabiri, S., Scibior, A., Zwartsenberg, B. & Wood, F. (2024). Nearest Neighbour Score Estimators for Diffusion Generative Models. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:38117-38144. Available from https://proceedings.mlr.press/v235/niedoba24a.html.
