Sample Average Approximation for Black-Box Variational Inference
Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence, PMLR 244:471-498, 2024.
Abstract
Black-box variational inference (BBVI) is a general-purpose approximate inference approach that converts inference into a stochastic optimization problem. However, the difficulty of solving the BBVI optimization problem reliably and robustly using stochastic gradient methods has limited its applicability. We present a novel optimization approach for BBVI based on the sample average approximation (SAA). SAA converts stochastic problems into deterministic ones by optimizing over a fixed random sample, which enables optimization tools such as quasi-Newton methods and line search that bypass the difficulties faced by stochastic gradient methods. We design an approach called "SAA for VI" that solves a sequence of SAA problems with increasing sample sizes to reliably and robustly solve BBVI problems without problem-specific tuning. We focus on quasi-Newton methods, which are well suited to problems with up to hundreds of latent variables. Our experiments show that SAA for VI simplifies the VI problem and converges faster than existing methods.
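To make the core idea concrete, the sketch below illustrates the SAA principle for BBVI under assumptions not taken from the paper: a toy standard-normal target, a mean-field Gaussian variational family, and SciPy's L-BFGS-B as the quasi-Newton solver. It shows a single SAA problem with a fixed sample; the paper's SAA for VI additionally solves a sequence of such problems with increasing sample sizes.

```python
# Minimal sketch of the SAA idea for BBVI (not the authors' implementation).
# Assumptions: 2-D standard-normal target, mean-field Gaussian family,
# SciPy L-BFGS-B as the quasi-Newton optimizer with line search.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d = 2                                # number of latent variables
N = 64                               # Monte Carlo sample size for this SAA problem
eps = rng.standard_normal((N, d))    # base randomness, frozen once

def log_target(z):
    # Hypothetical target: standard-normal log density (up to a constant).
    return -0.5 * np.sum(z ** 2, axis=-1)

def neg_elbo_saa(params):
    # Deterministic SAA estimate of the negative ELBO over the frozen sample.
    mu, log_sigma = params[:d], params[d:]
    z = mu + np.exp(log_sigma) * eps                       # reparameterized draws
    entropy = np.sum(log_sigma) + 0.5 * d * np.log(2 * np.pi * np.e)
    return -(np.mean(log_target(z)) + entropy)

# Because the objective is deterministic, an off-the-shelf quasi-Newton
# method with line search applies directly; a larger N tightens the
# approximation to the true variational objective.
result = minimize(neg_elbo_saa, x0=np.zeros(2 * d), method="L-BFGS-B")
print(result.x[:d], np.exp(result.x[d:]))                  # variational mean and scale
```

Because the random sample is fixed, repeated evaluations of the objective are consistent, which is what allows the line-search and curvature estimates of L-BFGS to work without the tuning that stochastic gradient methods typically require.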