An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo

Luigi Acerbi
Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference, PMLR 96:1-10, 2019.

Abstract

Variational Bayesian Monte Carlo (VBMC) is a novel framework for tackling approximate posterior and model inference in models with black-box, expensive likelihoods by means of a sample-efficient approach (Acerbi, 2018). VBMC combines variational inference with Gaussian-process (GP) based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective. VBMC has been shown to outperform state-of-the-art inference methods for expensive likelihoods on a benchmark consisting of meaningful synthetic densities and a real model-fitting problem from computational neuroscience. In this paper, we study the performance of VBMC under variations of two key components of the framework. First, we propose and evaluate a new general family of acquisition functions for active sampling, which includes as special cases the acquisition functions used in the original work. Second, we test different mean functions for the GP surrogate, including a novel squared-exponential GP mean function. From our empirical study, we derive insights about the stability of the current VBMC algorithm, which may help inform future theoretical and applied developments of the method.
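The core numerical device the abstract refers to, Bayesian quadrature, estimates an integral by fitting a GP to function evaluations and integrating the GP posterior mean in closed form. The sketch below is a minimal illustrative 1-D example of this idea, not the VBMC algorithm itself: it assumes a squared-exponential kernel, a Gaussian weighting density N(0, 1), noiseless observations, and hand-picked hyperparameters, in which case the kernel-density cross-covariance integral has a known closed form.

```python
# Minimal 1-D Bayesian quadrature sketch (illustrative assumptions throughout;
# hyperparameters and helper names are hypothetical, not from the paper).
import numpy as np

def se_kernel(a, b, ell=1.0, sf=1.0):
    """Squared-exponential kernel k(a, b) = sf^2 exp(-(a - b)^2 / (2 ell^2))."""
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def bq_estimate(x, y, ell=1.0, sf=1.0, mu=0.0, sigma=1.0):
    """GP posterior-mean estimate of the integral of f(x) N(x; mu, sigma^2) dx,
    given noiseless observations y = f(x)."""
    K = se_kernel(x, x, ell, sf) + 1e-10 * np.eye(len(x))  # jitter for stability
    # z_i = integral of k(x, x_i) N(x; mu, sigma^2) dx: a Gaussian-times-SE
    # product, which integrates in closed form.
    z = sf**2 * ell / np.sqrt(ell**2 + sigma**2) * np.exp(
        -0.5 * (x - mu)**2 / (ell**2 + sigma**2))
    # Bayesian quadrature estimate: z^T K^{-1} y.
    return z @ np.linalg.solve(K, y)

# Example: f(x) = exp(-x^2 / 2); the true value of the integral of
# f(x) N(x; 0, 1) dx is 1/sqrt(2) ~ 0.7071.
x = np.linspace(-4.0, 4.0, 17)
estimate = bq_estimate(x, np.exp(-0.5 * x**2))
```

In VBMC the same principle is applied inside the variational objective, with active sampling choosing where to evaluate the expensive log-likelihood next; the sketch only shows the quadrature step in isolation.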

Cite this Paper


BibTeX
@InProceedings{pmlr-v96-acerbi19a,
  title = {An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo},
  author = {Acerbi, Luigi},
  booktitle = {Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference},
  pages = {1--10},
  year = {2019},
  editor = {Ruiz, Francisco and Zhang, Cheng and Liang, Dawen and Bui, Thang},
  volume = {96},
  series = {Proceedings of Machine Learning Research},
  month = {02 Dec},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v96/acerbi19a/acerbi19a.pdf},
  url = {https://proceedings.mlr.press/v96/acerbi19a.html},
  abstract = {Variational Bayesian Monte Carlo (VBMC) is a novel framework for tackling approximate posterior and model inference in models with black-box, expensive likelihoods by means of a sample-efficient approach (Acerbi, 2018). VBMC combines variational inference with Gaussian-process (GP) based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective. VBMC has been shown to outperform state-of-the-art inference methods for expensive likelihoods on a benchmark consisting of meaningful synthetic densities and a real model-fitting problem from computational neuroscience. In this paper, we study the performance of VBMC under variations of two key components of the framework. First, we propose and evaluate a new general family of acquisition functions for active sampling, which includes as special cases the acquisition functions used in the original work. Second, we test different mean functions for the GP surrogate, including a novel squared-exponential GP mean function. From our empirical study, we derive insights about the stability of the current VBMC algorithm, which may help inform future theoretical and applied developments of the method.}
}
Endnote
%0 Conference Paper
%T An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo
%A Luigi Acerbi
%B Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2019
%E Francisco Ruiz
%E Cheng Zhang
%E Dawen Liang
%E Thang Bui
%F pmlr-v96-acerbi19a
%I PMLR
%P 1--10
%U https://proceedings.mlr.press/v96/acerbi19a.html
%V 96
%X Variational Bayesian Monte Carlo (VBMC) is a novel framework for tackling approximate posterior and model inference in models with black-box, expensive likelihoods by means of a sample-efficient approach (Acerbi, 2018). VBMC combines variational inference with Gaussian-process (GP) based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective. VBMC has been shown to outperform state-of-the-art inference methods for expensive likelihoods on a benchmark consisting of meaningful synthetic densities and a real model-fitting problem from computational neuroscience. In this paper, we study the performance of VBMC under variations of two key components of the framework. First, we propose and evaluate a new general family of acquisition functions for active sampling, which includes as special cases the acquisition functions used in the original work. Second, we test different mean functions for the GP surrogate, including a novel squared-exponential GP mean function. From our empirical study, we derive insights about the stability of the current VBMC algorithm, which may help inform future theoretical and applied developments of the method.
APA
Acerbi, L. (2019). An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo. Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 96:1-10. Available from https://proceedings.mlr.press/v96/acerbi19a.html.