On the Sampling Problem for Kernel Quadrature

François-Xavier Briol, Chris J. Oates, Jon Cockayne, Wilson Ye Chen, Mark Girolami
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:586-595, 2017.

Abstract

The standard Kernel Quadrature method for numerical integration with random point sets (also called Bayesian Monte Carlo) is known to converge in root mean square error at a rate determined by the ratio s/d, where s and d encode the smoothness and dimension of the integrand. However, an empirical investigation reveals that the rate constant C is highly sensitive to the distribution of the random points. In contrast to standard Monte Carlo integration, for which optimal importance sampling is well understood, the sampling distribution that minimises C for Kernel Quadrature does not admit a closed form. This paper argues that the practical choice of sampling distribution is an important open problem. One solution is considered: a novel automatic approach based on adaptive tempering and sequential Monte Carlo. Empirical results demonstrate that a dramatic reduction in integration error, of up to 4 orders of magnitude, can be achieved with the proposed method.
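To make the estimator discussed in the abstract concrete, the sketch below implements standard Kernel Quadrature (Bayesian Monte Carlo) with i.i.d. random points in one dimension. It is an illustration only, not the adaptive tempering/sequential Monte Carlo approach proposed in the paper; the Gaussian (RBF) kernel, the target measure N(0, 1), the choice to sample points from the target itself, and all function names are assumptions made for this example.

```python
import numpy as np

def rbf_kernel(x, y, ell=1.0):
    """Gaussian kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2)) for 1-D point sets."""
    return np.exp(-0.5 * ((x[:, None] - y[None, :]) / ell) ** 2)

def kernel_mean(x, ell=1.0):
    """Closed-form kernel mean z_i = E_{X ~ N(0,1)}[k(X, x_i)] for the RBF kernel."""
    var = ell ** 2
    return np.sqrt(var / (var + 1.0)) * np.exp(-0.5 * x ** 2 / (var + 1.0))

def kernel_quadrature(f, n=50, ell=1.0, jitter=1e-10, seed=None):
    """Estimate the integral of f against Pi = N(0, 1) with kernel quadrature,
    using points drawn from Pi itself (one possible sampling distribution)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)                    # random points x_1, ..., x_n ~ Pi
    K = rbf_kernel(x, x, ell) + jitter * np.eye(n)  # Gram matrix, regularised for stability
    z = kernel_mean(x, ell)                       # kernel mean embedding at the points
    w = np.linalg.solve(K, z)                     # quadrature weights w = K^{-1} z
    return w @ f(x)                               # estimate of  integral f dPi

# Example: integrate f(x) = sin(x) + x^2 against N(0, 1); the exact value is 1.
print(kernel_quadrature(lambda x: np.sin(x) + x ** 2, n=100, seed=0))
```

The sensitivity highlighted in the abstract can be explored by replacing the line that draws x from N(0, 1) with draws from a different proposal (and re-weighting accordingly): the convergence rate is unchanged, but the constant C, and hence the realised error, can differ substantially.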

Cite this Paper

BibTeX
@InProceedings{pmlr-v70-briol17a,
  title = {On the Sampling Problem for Kernel Quadrature},
  author = {Fran{\c{c}}ois-Xavier Briol and Chris J. Oates and Jon Cockayne and Wilson Ye Chen and Mark Girolami},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages = {586--595},
  year = {2017},
  editor = {Precup, Doina and Teh, Yee Whye},
  volume = {70},
  series = {Proceedings of Machine Learning Research},
  month = {06--11 Aug},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v70/briol17a/briol17a.pdf},
  url = {https://proceedings.mlr.press/v70/briol17a.html},
  abstract = {The standard Kernel Quadrature method for numerical integration with random point sets (also called Bayesian Monte Carlo) is known to converge in root mean square error at a rate determined by the ratio s/d, where s and d encode the smoothness and dimension of the integrand. However, an empirical investigation reveals that the rate constant C is highly sensitive to the distribution of the random points. In contrast to standard Monte Carlo integration, for which optimal importance sampling is well-understood, the sampling distribution that minimises C for Kernel Quadrature does not admit a closed form. This paper argues that the practical choice of sampling distribution is an important open problem. One solution is considered; a novel automatic approach based on adaptive tempering and sequential Monte Carlo. Empirical results demonstrate a dramatic reduction in integration error of up to 4 orders of magnitude can be achieved with the proposed method.}
}
Endnote
%0 Conference Paper
%T On the Sampling Problem for Kernel Quadrature
%A François-Xavier Briol
%A Chris J. Oates
%A Jon Cockayne
%A Wilson Ye Chen
%A Mark Girolami
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-briol17a
%I PMLR
%P 586--595
%U https://proceedings.mlr.press/v70/briol17a.html
%V 70
%X The standard Kernel Quadrature method for numerical integration with random point sets (also called Bayesian Monte Carlo) is known to converge in root mean square error at a rate determined by the ratio s/d, where s and d encode the smoothness and dimension of the integrand. However, an empirical investigation reveals that the rate constant C is highly sensitive to the distribution of the random points. In contrast to standard Monte Carlo integration, for which optimal importance sampling is well-understood, the sampling distribution that minimises C for Kernel Quadrature does not admit a closed form. This paper argues that the practical choice of sampling distribution is an important open problem. One solution is considered; a novel automatic approach based on adaptive tempering and sequential Monte Carlo. Empirical results demonstrate a dramatic reduction in integration error of up to 4 orders of magnitude can be achieved with the proposed method.
APA
Briol, F., Oates, C.J., Cockayne, J., Chen, W.Y. & Girolami, M. (2017). On the Sampling Problem for Kernel Quadrature. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:586-595. Available from https://proceedings.mlr.press/v70/briol17a.html.
