Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space

Yiheng Jiang, Sinho Chewi, Aram-Alexandre Pooladian
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:2720-2721, 2024.

Abstract

We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution $\pi$ over $\mathbb{R}^d$ by a product measure $\pi^\star$. When $\pi$ is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that $\pi^\star$ is close to the minimizer $\pi^\star_\diamond$ of the KL divergence over a \emph{polyhedral} set $\mathcal{P}_\diamond$, and (2) an algorithm for minimizing $\text{KL}(\cdot\|\pi)$ over $\mathcal{P}_\diamond$ with accelerated complexity $O(\sqrt \kappa \log(\kappa d/\varepsilon^2))$, where $\kappa$ is the condition number of $\pi$.
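As a quick reference, the display below sketches the two optimization problems the abstract alludes to, written in standard notation that the abstract itself does not introduce (here $\pi \propto e^{-V}$ with $\alpha I \preceq \nabla^2 V \preceq \beta I$, i.e. $\pi$ is $\alpha$-strongly log-concave and $\beta$-log-smooth):

$$
\pi^\star \in \operatorname*{arg\,min}_{\mu = \mu_1 \otimes \cdots \otimes \mu_d} \mathrm{KL}(\mu \,\|\, \pi), \qquad
\pi^\star_\diamond \in \operatorname*{arg\,min}_{\mu \in \mathcal{P}_\diamond} \mathrm{KL}(\mu \,\|\, \pi), \qquad
\kappa = \beta/\alpha .
$$

The first problem is mean-field variational inference, a minimization over product measures on $\mathbb{R}^d$; the second restricts the search to the finite-dimensional polyhedral subset $\mathcal{P}_\diamond$ of the Wasserstein space constructed in the paper, which is the set over which the accelerated first-order method operates.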

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-jiang24a,
  title     = {Algorithms for mean-field variational inference via polyhedral optimization in the {W}asserstein space},
  author    = {Jiang, Yiheng and Chewi, Sinho and Pooladian, Aram-Alexandre},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {2720--2721},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/jiang24a/jiang24a.pdf},
  url       = {https://proceedings.mlr.press/v247/jiang24a.html},
  abstract  = {We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution $\pi$ over $\mathbb{R}^d$ by a product measure $\pi^\star$. When $\pi$ is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that $\pi^\star$ is close to the minimizer $\pi^\star_\diamond$ of the KL divergence over a \emph{polyhedral} set $\mathcal{P}_\diamond$, and (2) an algorithm for minimizing $\text{KL}(\cdot\|\pi)$ over $\mathcal{P}_\diamond$ with accelerated complexity $O(\sqrt \kappa \log(\kappa d/\varepsilon^2))$, where $\kappa$ is the condition number of $\pi$.}
}
APA
Jiang, Y., Chewi, S., & Pooladian, A.-A. (2024). Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:2720-2721. Available from https://proceedings.mlr.press/v247/jiang24a.html.