An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo
Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference, PMLR 96:1-10, 2019.
Variational Bayesian Monte Carlo (VBMC) is a novel framework for tackling approximate posterior and model inference in models with black-box, expensive likelihoods by means of a sample-efficient approach (Acerbi, 2018). VBMC combines variational inference with Gaussian process (GP)-based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective. VBMC has been shown to outperform state-of-the-art inference methods for expensive likelihoods on a benchmark consisting of meaningful synthetic densities and a real model-fitting problem from computational neuroscience. In this paper, we study the performance of VBMC under variations of two key components of the framework. First, we propose and evaluate a new general family of acquisition functions for active sampling, which includes as special cases the acquisition functions used in the original work. Second, we test different mean functions for the GP surrogate, including a novel squared-exponential GP mean function. From our empirical study, we derive insights about the stability of the current VBMC algorithm, which may help inform future theoretical and applied developments of the method.
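To make the second contribution more concrete, the sketch below contrasts a negative-quadratic GP mean function (of the kind used in the original VBMC work) with a squared-exponential GP mean. This is a minimal illustration, not the paper's implementation: the exact parameterizations, the function and argument names (`m0`, `h`, `x_m`, `omega`), and the choice of NumPy are assumptions made here for exposition.

```python
import numpy as np

def negative_quadratic_mean(x, m0, x_m, omega):
    """Negative-quadratic GP mean (a sketch of the form used in VBMC):
    m(x) = m0 - 0.5 * sum_i ((x_i - x_m_i) / omega_i)^2.
    Goes to -inf away from x_m, encoding that the log joint decays in the tails."""
    x = np.atleast_2d(x)  # (n_points, dim)
    return m0 - 0.5 * np.sum(((x - x_m) / omega) ** 2, axis=1)

def squared_exponential_mean(x, h, x_m, omega):
    """Squared-exponential GP mean (hypothetical parameterization):
    m(x) = h * exp(-0.5 * sum_i ((x_i - x_m_i) / omega_i)^2).
    Decays to 0 (rather than -inf) away from x_m, with peak height h at x_m."""
    x = np.atleast_2d(x)
    return h * np.exp(-0.5 * np.sum(((x - x_m) / omega) ** 2, axis=1))
```

Both means peak at `x_m`; they differ in their tail behavior, which affects how the GP surrogate extrapolates the log joint density far from observed points.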