Regularized Dikin Walks for Sampling Truncated Logconcave Measures, Mixed Isoperimetry and Beyond Worst-Case Analysis

Minhui Jiang, Yuansi Chen
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:3017-3078, 2025.

Abstract

We study sampling from logconcave distributions truncated on polytopes, motivated by Bayesian models with indicator variables. Building on interior point methods and the Dikin walk, we analyze the mixing time of regularized Dikin walks. Our contributions include: (1) proving that the soft-threshold Dikin walk mixes in $\widetilde{O}(mn+\kappa n)$ iterations for logconcave distributions with condition number $\kappa$, dimension $n$ and $m$ linear constraints, without requiring bounded polytopes. Moreover, we introduce the regularized Dikin walk using Lewis weights and show it mixes in $\widetilde{O}(n^{2.5}+\kappa n)$; (2) extending the above mixing time guarantees to weakly log-concave truncated distributions with finite covariance matrices; and (3) going beyond worst-case mixing time analysis, we show that the soft-threshold Dikin walk mixes significantly faster when only $O(1)$ constraints intersect the high-probability mass of the distribution, improving the $\widetilde{O}(mn+\kappa n)$ upper bound to $\widetilde{O}(m + \kappa n)$. Additionally, we provide a practical implementation to generate a warm initialization.
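For intuition, the abstract's setting can be illustrated with a minimal sketch of a Metropolis-adjusted Dikin-style walk for a target $\exp(-f(x))$ truncated to a polytope $\{x : Ax \le b\}$. This is a generic illustration, not the paper's exact soft-threshold or Lewis-weight algorithm: the local metric here is the log-barrier Hessian plus a regularizer `eta * I` (a stand-in for the paper's regularization), and the step size `r` and `eta` are illustrative choices rather than the tuned values from the analysis.

```python
import numpy as np

def local_metric(A, b, x, eta=1.0):
    """Log-barrier Hessian of the polytope {Ax <= b} at x, plus eta*I.

    The regularizer keeps the metric positive definite even when the
    polytope is unbounded (fewer than n tight directions)."""
    s = b - A @ x                       # slacks; positive for interior x
    H = (A / s[:, None] ** 2).T @ A     # sum_i a_i a_i^T / s_i^2
    return H + eta * np.eye(len(x))

def dikin_step(A, b, x, f, rng, r=0.2, eta=1.0):
    """One Metropolis-Hastings step with proposal N(x, (r^2/n) H(x)^{-1})."""
    n = len(x)
    Hx = local_metric(A, b, x, eta)
    Lx = np.linalg.cholesky(np.linalg.inv(Hx))
    z = x + (r / np.sqrt(n)) * Lx @ rng.standard_normal(n)
    if np.any(A @ z >= b):              # proposal left the polytope: reject
        return x
    Hz = local_metric(A, b, z, eta)

    def log_prop(H, y, c):
        # log N(y; c, (r^2/n) H^{-1}) up to a constant shared by both terms
        d = y - c
        _, logdet = np.linalg.slogdet(H)
        return 0.5 * logdet - 0.5 * (n / r ** 2) * d @ H @ d

    # Accept/reject corrects for the state-dependent proposal covariance
    log_acc = (-f(z) + f(x)) + log_prop(Hz, x, z) - log_prop(Hx, z, x)
    return z if np.log(rng.uniform()) < log_acc else x
```

For example, running the chain on a standard Gaussian truncated to the box $[-1,1]^2$ (so $f(x) = \tfrac12\|x\|^2$, $A = [I; -I]$, $b = \mathbf{1}$) keeps every iterate strictly inside the polytope, since exterior proposals are rejected outright.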

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-jiang25a,
  title     = {Regularized Dikin Walks for Sampling Truncated Logconcave Measures, Mixed Isoperimetry and Beyond Worst-Case Analysis},
  author    = {Jiang, Minhui and Chen, Yuansi},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {3017--3078},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/jiang25a/jiang25a.pdf},
  url       = {https://proceedings.mlr.press/v291/jiang25a.html},
  abstract  = {We study sampling from logconcave distributions truncated on polytopes, motivated by Bayesian models with indicator variables. Built on interior point methods and the Dikin walk, we analyze the mixing time of regularized Dikin walks. Our contributions include: (1) proving that the soft-threshold Dikin walk mixes in $\widetilde{O}(mn+\kappa n)$ iterations for logconcave distributions with condition number $\kappa$, dimension $n$ and $m$ linear constraints, without requiring bounded polytopes. Moreover, we introduce the regularized Dikin walk using Lewis weights and show it mixes in $\widetilde{O}(n^{2.5}+\kappa n)$; (2) extending the above mixing time guarantees to weakly log-concave truncated distributions with finite covariance matrices; and (3) going beyond worst-case mixing time analysis, we show that soft-threshold Dikin walk mixes significantly faster when $O(1)$ number of constraints intersect the high-probability mass of the distribution, improving the $\widetilde{O}(mn+\kappa n)$ upper bound to $\widetilde{O}(m + \kappa n)$. Additionally, we provide practical implementation to generate a warm initialization.}
}
Endnote
%0 Conference Paper
%T Regularized Dikin Walks for Sampling Truncated Logconcave Measures, Mixed Isoperimetry and Beyond Worst-Case Analysis
%A Minhui Jiang
%A Yuansi Chen
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-jiang25a
%I PMLR
%P 3017--3078
%U https://proceedings.mlr.press/v291/jiang25a.html
%V 291
%X We study sampling from logconcave distributions truncated on polytopes, motivated by Bayesian models with indicator variables. Built on interior point methods and the Dikin walk, we analyze the mixing time of regularized Dikin walks. Our contributions include: (1) proving that the soft-threshold Dikin walk mixes in $\widetilde{O}(mn+\kappa n)$ iterations for logconcave distributions with condition number $\kappa$, dimension $n$ and $m$ linear constraints, without requiring bounded polytopes. Moreover, we introduce the regularized Dikin walk using Lewis weights and show it mixes in $\widetilde{O}(n^{2.5}+\kappa n)$; (2) extending the above mixing time guarantees to weakly log-concave truncated distributions with finite covariance matrices; and (3) going beyond worst-case mixing time analysis, we show that soft-threshold Dikin walk mixes significantly faster when $O(1)$ number of constraints intersect the high-probability mass of the distribution, improving the $\widetilde{O}(mn+\kappa n)$ upper bound to $\widetilde{O}(m + \kappa n)$. Additionally, we provide practical implementation to generate a warm initialization.
APA
Jiang, M. & Chen, Y. (2025). Regularized Dikin Walks for Sampling Truncated Logconcave Measures, Mixed Isoperimetry and Beyond Worst-Case Analysis. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:3017-3078. Available from https://proceedings.mlr.press/v291/jiang25a.html.