Diffusion Posterior Sampling is Computationally Intractable

Shivam Gupta, Ajil Jalal, Aditya Parulekar, Eric Price, Zhiyang Xun
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:17020-17059, 2024.

Abstract

Diffusion models are a remarkably effective way of learning and sampling from a distribution $p(x)$. In posterior sampling, one is also given a measurement model $p(y \mid x)$ and a measurement $y$, and would like to sample from $p(x \mid y)$. Posterior sampling is useful for tasks such as inpainting, super-resolution, and MRI reconstruction, so a number of recent works have given algorithms to heuristically approximate it; but none are known to converge to the correct distribution in polynomial time. In this paper we show that posterior sampling is computationally intractable: under the most basic assumption in cryptography—that one-way functions exist—there are instances for which every algorithm takes superpolynomial time, even though unconditional sampling is provably fast. We also show that the exponential-time rejection sampling algorithm is essentially optimal under the stronger plausible assumption that there are one-way functions that take exponential time to invert.
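
The rejection sampling baseline mentioned in the abstract is simple to state: draw x from the unconditional model p(x), then accept x with probability p(y | x)/M for any bound M >= max_x p(y | x); accepted draws are exact samples from p(x | y), but the expected number of trials is 1/Pr[accept], which can be exponential in the dimension. A minimal sketch in Python is below, on a toy Gaussian instance; the prior, likelihood, and bound M are illustrative stand-ins, not the paper's hard construction.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: a standard Gaussian prior p(x) for the (learned)
# unconditional model, and a Gaussian measurement model y = x + noise,
# so p(y | x) = N(y; x, SIGMA^2). Neither is the paper's instance.
SIGMA = 0.5  # measurement noise scale

def sample_prior():
    """Draw x ~ p(x); stands in for one run of the unconditional sampler."""
    return rng.normal(0.0, 1.0)

def likelihood(y, x):
    """p(y | x) = N(y; x, SIGMA^2)."""
    return np.exp(-(y - x) ** 2 / (2 * SIGMA**2)) / (SIGMA * np.sqrt(2 * np.pi))

def posterior_sample(y, max_trials=100_000):
    """Exact rejection sampling from p(x | y).

    Accept x ~ p(x) with probability p(y | x) / M, where here
    M = max_x p(y | x) = 1 / (SIGMA * sqrt(2 * pi)).
    Accepted draws are distributed exactly as p(x | y); the expected
    number of trials is 1 / Pr[accept].
    """
    M = 1.0 / (SIGMA * np.sqrt(2 * np.pi))
    for _ in range(max_trials):
        x = sample_prior()
        if rng.uniform() < likelihood(y, x) / M:
            return x
    raise RuntimeError("acceptance probability too small")

print(posterior_sample(y=1.3))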

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-gupta24a,
  title = {Diffusion Posterior Sampling is Computationally Intractable},
  author = {Gupta, Shivam and Jalal, Ajil and Parulekar, Aditya and Price, Eric and Xun, Zhiyang},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages = {17020--17059},
  year = {2024},
  editor = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume = {235},
  series = {Proceedings of Machine Learning Research},
  month = {21--27 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/gupta24a/gupta24a.pdf},
  url = {https://proceedings.mlr.press/v235/gupta24a.html}
}
Endnote
%0 Conference Paper
%T Diffusion Posterior Sampling is Computationally Intractable
%A Shivam Gupta
%A Ajil Jalal
%A Aditya Parulekar
%A Eric Price
%A Zhiyang Xun
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-gupta24a
%I PMLR
%P 17020--17059
%U https://proceedings.mlr.press/v235/gupta24a.html
%V 235
APA
Gupta, S., Jalal, A., Parulekar, A., Price, E., & Xun, Z. (2024). Diffusion Posterior Sampling is Computationally Intractable. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:17020-17059. Available from https://proceedings.mlr.press/v235/gupta24a.html.