Variational Inference with Locally Enhanced Bounds for Hierarchical Models

Tomas Geffner, Justin Domke
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:7310-7323, 2022.

Abstract

Hierarchical models represent a challenging setting for inference algorithms. MCMC methods struggle to scale to large models with many local variables and observations, and variational inference (VI) may fail to provide accurate approximations due to the use of simple variational families. Some variational methods (e.g. importance weighted VI) integrate Monte Carlo methods to give better accuracy, but these tend to be unsuitable for hierarchical models, as they do not allow for subsampling and their performance tends to degrade for high dimensional models. We propose a new family of variational bounds for hierarchical models, based on the application of tightening methods (e.g. importance weighting) separately for each group of local random variables. We show that our approach naturally allows the use of subsampling to get unbiased gradients, and that it fully leverages the power of methods that build tighter lower bounds by applying them independently in lower dimensional spaces, leading to better results and more accurate posterior approximations than relevant baselines.
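To make the construction concrete, here is a sketch of the kind of per-group bound the abstract describes, under an assumed standard hierarchical structure with a global variable $\theta$ and $N$ conditionally independent local groups (the notation is illustrative, not taken from the paper):

\[
p(\theta, z_{1:N}, x_{1:N}) = p(\theta) \prod_{n=1}^{N} p(z_n, x_n \mid \theta).
\]

Applying a $K$-sample importance-weighted bound separately to each group, rather than jointly to all variables, gives a lower bound on $\log p(x_{1:N})$ of the form

\[
\mathcal{L} = \mathbb{E}_{q(\theta)}\!\left[ \log \frac{p(\theta)}{q(\theta)}
+ \sum_{n=1}^{N} \mathbb{E}_{z_n^{(1)},\dots,z_n^{(K)} \sim q(z_n \mid \theta)}
\log \frac{1}{K} \sum_{k=1}^{K} \frac{p\big(z_n^{(k)}, x_n \mid \theta\big)}{q\big(z_n^{(k)} \mid \theta\big)} \right].
\]

Because the bound decomposes as a sum over groups, drawing a uniform minibatch $B$ of groups and rescaling that sum by $N/|B|$ yields an unbiased estimate (and hence unbiased gradients), and each importance-weighted term acts only on the low-dimensional local variable $z_n$, which is the regime where such tightening methods remain effective.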

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-geffner22a,
  title     = {Variational Inference with Locally Enhanced Bounds for Hierarchical Models},
  author    = {Geffner, Tomas and Domke, Justin},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {7310--7323},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/geffner22a/geffner22a.pdf},
  url       = {https://proceedings.mlr.press/v162/geffner22a.html}
}
APA
Geffner, T. & Domke, J. (2022). Variational Inference with Locally Enhanced Bounds for Hierarchical Models. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:7310-7323. Available from https://proceedings.mlr.press/v162/geffner22a.html.