Improved Denoising Diffusion Probabilistic Models

Alexander Quinn Nichol, Prafulla Dhariwal
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8162-8171, 2021.

Abstract

Denoising diffusion probabilistic models (DDPM) are a class of generative models which have recently been shown to produce excellent samples. We show that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code and pre-trained models at https://github.com/openai/improved-diffusion.
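The learned reverse-process variances mentioned in the abstract can be sketched numerically. The interpolation form below (exponentiating a weighted combination of log β_t and log β̃_t) follows the parameterization described in the paper body, which is not reproduced on this page; the linear β schedule and the fixed interpolation weight `v` are placeholder assumptions for illustration, since in the actual model `v` is a per-dimension network output.

```python
import numpy as np

# Placeholder toy schedule (linear betas); the paper itself proposes a
# cosine schedule, which is not shown here.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_cumprod = np.cumprod(1.0 - betas)

def learned_sigma(v, t):
    """Reverse-process variance as an interpolation between the two extremes.

    beta_t is an upper bound and beta_tilde_t (the posterior variance)
    a lower bound; the model's output v in [0, 1] picks a point between
    them in log space: exp(v * log beta_t + (1 - v) * log beta_tilde_t).
    Here v is passed in directly instead of coming from a network.
    """
    beta_t = betas[t]
    alpha_bar_prev = alphas_cumprod[t - 1] if t > 0 else 1.0
    beta_tilde_t = (1.0 - alpha_bar_prev) / (1.0 - alphas_cumprod[t]) * beta_t
    log_var = v * np.log(beta_t) + (1.0 - v) * np.log(beta_tilde_t)
    return np.exp(log_var)
```

Setting `v = 1.0` recovers β_t and `v = 0.0` recovers β̃_t; because these bounds nearly coincide for most timesteps but diverge near t = 0, letting the network choose `v` is what makes much shorter sampling schedules viable without a large quality drop.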

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-nichol21a,
  title     = {Improved Denoising Diffusion Probabilistic Models},
  author    = {Nichol, Alexander Quinn and Dhariwal, Prafulla},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {8162--8171},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/nichol21a/nichol21a.pdf},
  url       = {https://proceedings.mlr.press/v139/nichol21a.html},
  abstract  = {Denoising diffusion probabilistic models (DDPM) are a class of generative models which have recently been shown to produce excellent samples. We show that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code and pre-trained models at https://github.com/openai/improved-diffusion.}
}
Endnote
%0 Conference Paper
%T Improved Denoising Diffusion Probabilistic Models
%A Alexander Quinn Nichol
%A Prafulla Dhariwal
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-nichol21a
%I PMLR
%P 8162--8171
%U https://proceedings.mlr.press/v139/nichol21a.html
%V 139
%X Denoising diffusion probabilistic models (DDPM) are a class of generative models which have recently been shown to produce excellent samples. We show that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code and pre-trained models at https://github.com/openai/improved-diffusion.
APA
Nichol, A. Q., & Dhariwal, P. (2021). Improved Denoising Diffusion Probabilistic Models. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8162-8171. Available from https://proceedings.mlr.press/v139/nichol21a.html.

Related Material