Improving Quadrature for Constrained Integrands
Proceedings of Machine Learning Research, PMLR 89:2751-2759, 2019.
Abstract
We present an improved Bayesian framework for performing inference of affine transformations of constrained functions. We focus on quadrature with nonnegative functions, a common task in Bayesian inference. We consider constraints on the range of the function of interest, such as nonnegativity or boundedness. Although our framework is general, we derive explicit approximation schemes for these constraints, and argue for the use of a log transformation for functions with high dynamic range, such as likelihood surfaces. We propose a novel method for optimizing hyperparameters in this framework: we optimize the marginal likelihood in the original space, as opposed to in the transformed space. The result is a model that better explains the actual data. Experiments on synthetic and real-world data demonstrate that our framework achieves superior estimates using less wall-clock time than existing Bayesian quadrature procedures.
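The log transformation advocated in the abstract can be illustrated with a minimal sketch: model g = log f with a Gaussian process, then recover an estimate of the integral of the nonnegative integrand f by exponentiating the GP posterior via the log-normal mean, exp(mu + var/2). This is an assumption-laden toy (the kernel, the hyperparameters `ell` and `sf`, and the grid-based integration are all made up for illustration), not the paper's actual approximation or hyperparameter-optimization scheme.

```python
import numpy as np

def rbf(a, b, ell=0.5, sf=3.0):
    """Squared-exponential covariance between 1-D point sets a and b.
    Hyperparameters are illustrative, not tuned by any principled scheme."""
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

def log_warped_quadrature(f, xs, grid):
    """Estimate the integral of a nonnegative f over `grid` by fitting a GP
    to g = log f at the points xs, then integrating the pointwise
    log-normal mean E[exp(g)] = exp(mu + var / 2)."""
    y = np.log(f(xs))                 # observe the integrand in log space
    ym = y.mean()                     # center targets for a zero-mean GP
    K = rbf(xs, xs) + 1e-8 * np.eye(len(xs))
    alpha = np.linalg.solve(K, y - ym)
    ks = rbf(grid, xs)
    mu = ym + ks @ alpha              # posterior mean of log f on the grid
    var = rbf(grid, grid).diagonal() - np.einsum(
        "ij,ji->i", ks, np.linalg.solve(K, ks.T))
    var = np.clip(var, 0.0, None)     # guard against tiny negative values
    fx = np.exp(mu + 0.5 * var)       # log-normal mean, pointwise
    return np.sum(fx) * (grid[1] - grid[0])   # simple Riemann sum

# Toy nonnegative integrand: an unnormalized Gaussian "likelihood".
f = lambda x: np.exp(-0.5 * (x / 0.5) ** 2)
xs = np.linspace(-2.0, 2.0, 15)
grid = np.linspace(-2.0, 2.0, 400)
estimate = log_warped_quadrature(f, xs, grid)
```

The true value here is sqrt(2*pi) * 0.5 ≈ 1.2533; because the GP models log f rather than f, the estimate is guaranteed nonnegative, which an unwarped GP model of f would not be.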