Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:2720-2721, 2024.
Abstract
We develop a theory of finite-dimensional polyhedral subsets of the Wasserstein space and the optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution π over ℝd by a product measure π⋆. When π is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that π⋆ is close to the minimizer π⋆⋄ of the KL divergence over a polyhedral set P⋄, and (2) an algorithm for minimizing KL(⋅‖π) over P⋄ with accelerated complexity O(√κ log(κd/ε²)), where κ is the condition number of π.
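As a toy illustration of the mean-field objective (a sketch only, not the paper's polyhedral first-order algorithm), consider a strongly log-concave Gaussian target π = N(0, Σ) on ℝ². The best product-Gaussian approximation q minimizing KL(q‖π) has the same mean as π and per-coordinate variances 1/Λᵢᵢ, where Λ = Σ⁻¹; coordinate ascent (CAVI) recovers this fixed point:

```python
import numpy as np

# Hypothetical illustration: mean-field VI for a Gaussian target
# pi = N(0, Sigma). The optimal product-Gaussian factors q_i have
# variance 1 / Lambda_ii, where Lambda = Sigma^{-1} is the precision.
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
Lambda = np.linalg.inv(Sigma)

# Coordinate ascent (CAVI): each factor update has a closed form
# because the target's conditionals are Gaussian.
m = np.array([1.0, -1.0])  # arbitrary initialization of factor means
for _ in range(100):
    for i in range(2):
        j = 1 - i
        # Update the mean of factor q_i given the current mean of q_j.
        m[i] = -Lambda[i, j] * m[j] / Lambda[i, i]
s2 = 1.0 / np.diag(Lambda)  # optimal mean-field variances

print(m)   # converges to the target mean (0, 0)
print(s2)  # per-coordinate variances 1 / Lambda_ii
```

Note that the mean-field variances 1/Λᵢᵢ are smaller than the true marginal variances Σᵢᵢ, a well-known underestimation effect of the product-measure approximation.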