Geometric rates of convergence for kernel-based sampling algorithms

Rajiv Khanna, Liam Hodgkinson, Michael W. Mahoney
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:2156-2164, 2021.

Abstract

The rate of convergence of weighted kernel herding (WKH) and sequential Bayesian quadrature (SBQ), two kernel-based sampling algorithms for estimating integrals with respect to some target probability measure, is investigated. Under verifiable conditions on the chosen kernel and target measure, we establish a near-geometric rate of convergence for target measures that are nearly atomic. Furthermore, we show these algorithms perform comparably to the theoretical best possible sampling algorithm under the maximum mean discrepancy. An analysis is also conducted in a distributed setting. Our theoretical developments are supported by empirical observations on simulated data as well as a real-world application.
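To illustrate the flavor of the algorithms studied here, the following is a minimal sketch of plain (unweighted) kernel herding, a simpler relative of the WKH and SBQ methods the paper actually analyzes. It greedily selects points to minimize the maximum mean discrepancy (MMD) to a target measure. The Gaussian target, RBF kernel, bandwidth, and candidate-pool construction are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate pool drawn from a hypothetical target measure: a 1-D Gaussian.
pool = rng.normal(size=500)

def k(x, y, bw=1.0):
    """RBF (Gaussian) kernel."""
    return np.exp(-0.5 * ((x - y) / bw) ** 2)

# Gram matrix of the pool, and the kernel mean embedding of the target
# estimated on the pool itself: mu[i] ~ E_{x'~p} k(pool[i], x').
K = k(pool[:, None], pool[None, :])
mu = K.mean(axis=1)

# Greedy kernel herding: at each step pick the candidate maximizing the
# herding objective  mu(x) - mean_j k(x, x_j)  over already-chosen x_j.
selected = []
for t in range(20):
    penalty = K[:, selected].mean(axis=1) if selected else 0.0
    selected.append(int(np.argmax(mu - penalty)))

samples = pool[selected]

# Squared MMD between the herded sample's empirical embedding and the
# pool-based target embedding; small values mean a good quadrature set.
mmd2 = (K[np.ix_(selected, selected)].mean()
        - 2 * mu[selected].mean()
        + K.mean())
```

WKH additionally optimizes the weights on the selected points (rather than using uniform 1/n weights), and SBQ selects points by a Bayesian-quadrature posterior-variance criterion; the paper's contribution is the near-geometric convergence rate of these weighted variants.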

Cite this Paper


BibTeX
@InProceedings{pmlr-v161-khanna21a,
  title     = {Geometric rates of convergence for kernel-based sampling algorithms},
  author    = {Khanna, Rajiv and Hodgkinson, Liam and Mahoney, Michael W.},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages     = {2156--2164},
  year      = {2021},
  editor    = {de Campos, Cassio and Maathuis, Marloes H.},
  volume    = {161},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v161/khanna21a/khanna21a.pdf},
  url       = {https://proceedings.mlr.press/v161/khanna21a.html},
  abstract  = {The rate of convergence of weighted kernel herding (WKH) and sequential Bayesian quadrature (SBQ), two kernel-based sampling algorithms for estimating integrals with respect to some target probability measure, is investigated. Under verifiable conditions on the chosen kernel and target measure, we establish a near-geometric rate of convergence for target measures that are nearly atomic. Furthermore, we show these algorithms perform comparably to the theoretical best possible sampling algorithm under the maximum mean discrepancy. An analysis is also conducted in a distributed setting. Our theoretical developments are supported by empirical observations on simulated data as well as a real-world application.}
}
Endnote
%0 Conference Paper
%T Geometric rates of convergence for kernel-based sampling algorithms
%A Rajiv Khanna
%A Liam Hodgkinson
%A Michael W. Mahoney
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-khanna21a
%I PMLR
%P 2156--2164
%U https://proceedings.mlr.press/v161/khanna21a.html
%V 161
%X The rate of convergence of weighted kernel herding (WKH) and sequential Bayesian quadrature (SBQ), two kernel-based sampling algorithms for estimating integrals with respect to some target probability measure, is investigated. Under verifiable conditions on the chosen kernel and target measure, we establish a near-geometric rate of convergence for target measures that are nearly atomic. Furthermore, we show these algorithms perform comparably to the theoretical best possible sampling algorithm under the maximum mean discrepancy. An analysis is also conducted in a distributed setting. Our theoretical developments are supported by empirical observations on simulated data as well as a real-world application.
APA
Khanna, R., Hodgkinson, L. &amp; Mahoney, M.W. (2021). Geometric rates of convergence for kernel-based sampling algorithms. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:2156-2164. Available from https://proceedings.mlr.press/v161/khanna21a.html.