Fast parallel sampling under isoperimetry

Nima Anari, Sinho Chewi, Thuy-Duong Vuong
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:161-185, 2024.

Abstract

We show how to sample in parallel from a distribution $\pi$ over $\mathbb{R}^d$ that satisfies a log-Sobolev inequality and has a smooth log-density, by parallelizing the Langevin (resp. underdamped Langevin) algorithms. We show that our algorithm outputs samples from a distribution $\hat{\pi}$ that is close to $\pi$ in Kullback–Leibler (KL) divergence (resp. total variation (TV) distance), while using only $\log(d)^{O(1)}$ parallel rounds and $\widetilde{O}(d)$ (resp. $\widetilde{O}(\sqrt{d})$) gradient evaluations in total. These are the first parallel sampling algorithms with TV distance guarantees. For our main application, we show how to combine the TV distance guarantees of our algorithms with prior work to obtain RNC sampling-to-counting reductions for families of discrete distributions on the hypercube $\{\pm 1\}^n$ that are closed under exponential tilts and have bounded covariance. Consequently, we obtain an RNC sampler for directed Eulerian tours and asymmetric determinantal point processes, resolving open questions raised in prior works.
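To make the parallelization concrete, below is a minimal NumPy sketch of the kind of primitive the abstract describes: one short time window of the Langevin diffusion refined by Picard (fixed-point) iteration, so that all gradient evaluations within an iteration are mutually independent and can be issued in a single parallel round. The function name, the window/grid/iteration parameters (h, K, M), and the sequential list comprehension standing in for a genuinely parallel gradient round are illustrative assumptions; this is not the paper's exact algorithm, discretization, or constants.

```python
import numpy as np

def parallel_langevin_window(x0, grad_log_pi, h, K, M, rng):
    """One window [0, h] of the Langevin diffusion
        dX_t = grad log pi(X_t) dt + sqrt(2) dB_t,
    approximated by Picard (fixed-point) iteration on a K-point Euler grid.
    Each iteration needs K gradient evaluations that depend only on the
    previous iterate, so it costs a single parallel round of gradient
    queries; a small number M of rounds suffices when h is small relative
    to 1/L, where L is the smoothness constant of log pi.
    """
    d = x0.shape[0]
    dt = h / K
    # Sample the Brownian increments once; they are held fixed across
    # Picard iterations, so the fixed point tracks a single SDE path.
    dB = rng.normal(scale=np.sqrt(2.0 * dt), size=(K, d))
    B = np.vstack([np.zeros(d), np.cumsum(dB, axis=0)])  # B at t_0, ..., t_K
    xs = np.tile(x0, (K + 1, 1))        # initial guess: constant path
    for _ in range(M):
        # All K gradients use only the previous iterate -> one parallel round.
        grads = np.array([grad_log_pi(x) for x in xs[:-1]])
        drift = np.vstack([np.zeros(d), dt * np.cumsum(grads, axis=0)])
        xs = x0 + drift + B             # left-endpoint Euler quadrature
    return xs[-1]                       # state at time h

# Toy usage: standard Gaussian target, where grad log pi(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
for _ in range(50):                     # chain windows to approach stationarity
    x = parallel_langevin_window(x, lambda y: -y, h=0.1, K=8, M=4, rng=rng)
```

Chaining many such windows, each costing only M parallel rounds, is what drives the depth down to polylogarithmic in $d$ while keeping the total gradient count comparable to sequential Langevin, in the spirit of (though not identical to) the bounds stated above.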

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-anari24a,
  title     = {Fast parallel sampling under isoperimetry},
  author    = {Anari, Nima and Chewi, Sinho and Vuong, Thuy-Duong},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {161--185},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/anari24a/anari24a.pdf},
  url       = {https://proceedings.mlr.press/v247/anari24a.html}
}
APA
Anari, N., Chewi, S. & Vuong, T.-D. (2024). Fast parallel sampling under isoperimetry. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:161-185. Available from https://proceedings.mlr.press/v247/anari24a.html.
