Marginalising over Stationary Kernels with Bayesian Quadrature

Saad Hamid, Sebastian Schulze, Michael A. Osborne, Stephen Roberts
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:9776-9792, 2022.

Abstract

Marginalising over families of Gaussian Process kernels produces flexible model classes with well-calibrated uncertainty estimates. Existing approaches require likelihood evaluations of many kernels, rendering them prohibitively expensive for larger datasets. We propose a Bayesian Quadrature scheme to make this marginalisation more efficient and thereby more practical. Through use of maximum mean discrepancies between distributions, we define a kernel over kernels that captures invariances between Spectral Mixture (SM) Kernels. Kernel samples are selected by generalising an information-theoretic acquisition function for warped Bayesian Quadrature. We show that our framework achieves more accurate predictions with better calibrated uncertainty than state-of-the-art baselines, especially when given limited (wall-clock) time budgets.
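The core construction the abstract mentions, a kernel over kernels built from maximum mean discrepancies, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes each stationary kernel is represented by samples from its spectral density (via Bochner's theorem; for SM kernels this density is a Gaussian mixture), computes a squared MMD between those sample sets with an inner RBF kernel, and exponentiates it to obtain a similarity. The function names and lengthscale parameters are illustrative choices, not from the paper.

```python
import numpy as np

def mmd_sq(x, y, lengthscale=1.0):
    """Biased (V-statistic) estimate of the squared MMD between sample
    sets x (n, d) and y (m, d), using an RBF inner kernel."""
    def rbf(a, b):
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2.0 * lengthscale ** 2))
    return rbf(x, x).mean() + rbf(y, y).mean() - 2.0 * rbf(x, y).mean()

def kernel_over_kernels(samples_p, samples_q, outer_lengthscale=1.0):
    """Similarity between two stationary kernels, each represented by
    samples from its spectral density. Equal sample sets give 1.0;
    distant spectral densities give values near 0."""
    return np.exp(-mmd_sq(samples_p, samples_q)
                  / (2.0 * outer_lengthscale ** 2))
```

Because the MMD compares spectral densities rather than raw hyperparameter vectors, this similarity is invariant to relabelling of mixture components, which is the kind of invariance between SM kernels the abstract refers to.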

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-hamid22a,
  title     = {Marginalising over Stationary Kernels with Bayesian Quadrature},
  author    = {Hamid, Saad and Schulze, Sebastian and Osborne, Michael A. and Roberts, Stephen},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {9776--9792},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/hamid22a/hamid22a.pdf},
  url       = {https://proceedings.mlr.press/v151/hamid22a.html},
  abstract  = {Marginalising over families of Gaussian Process kernels produces flexible model classes with well-calibrated uncertainty estimates. Existing approaches require likelihood evaluations of many kernels, rendering them prohibitively expensive for larger datasets. We propose a Bayesian Quadrature scheme to make this marginalisation more efficient and thereby more practical. Through use of maximum mean discrepancies between distributions, we define a kernel over kernels that captures invariances between Spectral Mixture (SM) Kernels. Kernel samples are selected by generalising an information-theoretic acquisition function for warped Bayesian Quadrature. We show that our framework achieves more accurate predictions with better calibrated uncertainty than state-of-the-art baselines, especially when given limited (wall-clock) time budgets.}
}
Endnote
%0 Conference Paper
%T Marginalising over Stationary Kernels with Bayesian Quadrature
%A Saad Hamid
%A Sebastian Schulze
%A Michael A. Osborne
%A Stephen Roberts
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-hamid22a
%I PMLR
%P 9776--9792
%U https://proceedings.mlr.press/v151/hamid22a.html
%V 151
%X Marginalising over families of Gaussian Process kernels produces flexible model classes with well-calibrated uncertainty estimates. Existing approaches require likelihood evaluations of many kernels, rendering them prohibitively expensive for larger datasets. We propose a Bayesian Quadrature scheme to make this marginalisation more efficient and thereby more practical. Through use of maximum mean discrepancies between distributions, we define a kernel over kernels that captures invariances between Spectral Mixture (SM) Kernels. Kernel samples are selected by generalising an information-theoretic acquisition function for warped Bayesian Quadrature. We show that our framework achieves more accurate predictions with better calibrated uncertainty than state-of-the-art baselines, especially when given limited (wall-clock) time budgets.
APA
Hamid, S., Schulze, S., Osborne, M.A. & Roberts, S. (2022). Marginalising over Stationary Kernels with Bayesian Quadrature. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:9776-9792. Available from https://proceedings.mlr.press/v151/hamid22a.html.