Improved analysis for a proximal algorithm for sampling

Yongxin Chen, Sinho Chewi, Adil Salim, Andre Wibisono
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:2984-3014, 2022.

Abstract

We study the proximal sampler of Lee, Shen, and Tian (2021) and obtain new convergence guarantees under weaker assumptions than strong log-concavity: namely, our results hold for (1) weakly log-concave targets, and (2) targets satisfying isoperimetric assumptions which allow for non-log-concavity. We demonstrate our results by obtaining new state-of-the-art sampling guarantees for several classes of target distributions. We also strengthen the connection between the proximal sampler and the proximal method in optimization by interpreting the former as an entropically regularized Wasserstein gradient flow and the latter as the limit of one.
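For readers unfamiliar with the algorithm: each iteration of the proximal sampler alternates a Gaussian forward step with a draw from the so-called restricted Gaussian oracle (RGO). The sketch below is not from the paper; it is a minimal illustration on a one-dimensional Gaussian target, where the RGO is itself Gaussian and can be sampled in closed form. The names and parameter values (sigma2, eta, n_iters) are illustrative choices, not the paper's notation.

import numpy as np

def proximal_sampler_gaussian(sigma2=1.0, eta=0.5, n_iters=1000, rng=None):
    """Proximal sampler for the 1D Gaussian target pi = N(0, sigma2).

    Each iteration:
      1. Forward step:  y ~ N(x, eta)  (add Gaussian noise).
      2. Backward step (RGO): x ~ pi(x) * exp(-(x - y)^2 / (2 * eta)).
         For this Gaussian target the RGO is again Gaussian, with
         mean y * sigma2 / (sigma2 + eta) and variance
         sigma2 * eta / (sigma2 + eta), so it can be drawn exactly.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = 0.0
    samples = []
    for _ in range(n_iters):
        # Forward step: y ~ N(x, eta).
        y = x + np.sqrt(eta) * rng.standard_normal()
        # Backward step: exact RGO draw for the Gaussian target.
        mean = y * sigma2 / (sigma2 + eta)
        var = sigma2 * eta / (sigma2 + eta)
        x = mean + np.sqrt(var) * rng.standard_normal()
        samples.append(x)
    return np.array(samples)

For a general target pi proportional to exp(-V), the RGO instead draws from a density proportional to exp(-V(x) - ||x - y||^2 / (2 * eta)) and is typically implemented by rejection sampling when eta is small; the paper's contribution is the convergence analysis of the outer loop under weak log-concavity or isoperimetric assumptions, not a new RGO implementation.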

Cite this Paper


BibTeX
@InProceedings{pmlr-v178-chen22c,
  title = {Improved analysis for a proximal algorithm for sampling},
  author = {Chen, Yongxin and Chewi, Sinho and Salim, Adil and Wibisono, Andre},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages = {2984--3014},
  year = {2022},
  editor = {Loh, Po-Ling and Raginsky, Maxim},
  volume = {178},
  series = {Proceedings of Machine Learning Research},
  month = {02--05 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v178/chen22c/chen22c.pdf},
  url = {https://proceedings.mlr.press/v178/chen22c.html},
  abstract = {We study the proximal sampler of Lee, Shen, and Tian (2021) and obtain new convergence guarantees under weaker assumptions than strong log-concavity: namely, our results hold for (1) weakly log-concave targets, and (2) targets satisfying isoperimetric assumptions which allow for non-log-concavity. We demonstrate our results by obtaining new state-of-the-art sampling guarantees for several classes of target distributions. We also strengthen the connection between the proximal sampler and the proximal method in optimization by interpreting the former as an entropically regularized Wasserstein gradient flow and the latter as the limit of one.}
}
APA
Chen, Y., Chewi, S., Salim, A. & Wibisono, A. (2022). Improved analysis for a proximal algorithm for sampling. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:2984-3014. Available from https://proceedings.mlr.press/v178/chen22c.html.