Constant-Time Predictive Distributions for Gaussian Processes

Geoff Pleiss, Jacob Gardner, Kilian Weinberger, Andrew Gordon Wilson
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4114-4123, 2018.

Abstract

One of the most compelling features of Gaussian process (GP) regression is its ability to provide well-calibrated posterior distributions. Recent advances in inducing point methods have sped up GP marginal likelihood and posterior mean computations, leaving posterior covariance estimation and sampling as the remaining computational bottlenecks. In this paper we address these shortcomings by using the Lanczos algorithm to rapidly approximate the predictive covariance matrix. Our approach, which we refer to as LOVE (LanczOs Variance Estimates), substantially improves time and space complexity. In our experiments, LOVE computes covariances up to 2,000 times faster and draws samples 18,000 times faster than existing methods, all without sacrificing accuracy.
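The core linear-algebra idea is easy to sketch. Below is a minimal NumPy illustration of the Lanczos variance trick the abstract describes: run k Lanczos iterations on the training covariance A = K_XX + sigma^2 I to get an orthonormal Q and tridiagonal T with A ≈ Q T Q^T, precompute R = Q L^{-T} from the Cholesky factor T = L L^T so that A^{-1} ≈ R R^T, and then each predictive variance reduces to an inner product. All names here (rbf_kernel, lanczos, the probe choice) are illustrative assumptions, not the authors' implementation; the paper applies this to structured KISS-GP kernels to reach constant-time per-query cost, whereas this sketch uses a small dense kernel purely to show the algebra.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel matrix between two sets of points.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def lanczos(matvec, b, k):
    # k steps of Lanczos with full reorthogonalization.
    # Returns Q (n x k, orthonormal) and tridiagonal T (k x k)
    # such that A is approximated by Q T Q^T on the Krylov
    # subspace spanned by b, A b, ..., A^{k-1} b.
    # (Breakdown handling is omitted for brevity.)
    n = b.shape[0]
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)
    q = b / np.linalg.norm(b)
    for j in range(k):
        Q[:, j] = q
        w = matvec(q)
        alpha[j] = q @ w
        w -= alpha[j] * q
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # Full reorthogonalization against all previous Lanczos vectors.
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return Q, T

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))       # training inputs
Xstar = rng.uniform(-3, 3, size=(10, 1))    # test inputs
noise = 0.1

A = rbf_kernel(X, X) + noise * np.eye(len(X))   # K_XX + sigma^2 I
k_star = rbf_kernel(X, Xstar)                   # cross-covariance K_X*

# One-time precomputation: A^{-1} is approximated by Q T^{-1} Q^T = R R^T,
# where R = Q L^{-T} and T = L L^T. The mean cross-covariance vector is
# used here as the Lanczos probe (an illustrative choice).
Q, T = lanczos(lambda v: A @ v, k_star.mean(axis=1), k=50)
L = np.linalg.cholesky(T)
R = np.linalg.solve(L, Q.T).T                   # n x k

# Each test variance is now an O(nk) inner product instead of a full solve:
# var(x*) ~ k(x*, x*) - ||R^T k_*||^2   (k(x*, x*) = 1 for this RBF kernel).
approx_var = 1.0 - ((R.T @ k_star) ** 2).sum(axis=0)
exact_var = 1.0 - np.einsum("ij,ij->j", k_star, np.linalg.solve(A, k_star))
print("max abs error:", np.abs(approx_var - exact_var).max())

The key design point is that the expensive Lanczos factorization happens once; afterward every test-point variance query touches only the precomputed R, which is what makes the amortized per-query cost so low.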

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-pleiss18a,
  title     = {Constant-Time Predictive Distributions for {G}aussian Processes},
  author    = {Pleiss, Geoff and Gardner, Jacob and Weinberger, Kilian and Wilson, Andrew Gordon},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {4114--4123},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/pleiss18a/pleiss18a.pdf},
  url       = {https://proceedings.mlr.press/v80/pleiss18a.html},
  abstract  = {One of the most compelling features of Gaussian process (GP) regression is its ability to provide well-calibrated posterior distributions. Recent advances in inducing point methods have sped up GP marginal likelihood and posterior mean computations, leaving posterior covariance estimation and sampling as the remaining computational bottlenecks. In this paper we address these shortcomings by using the Lanczos algorithm to rapidly approximate the predictive covariance matrix. Our approach, which we refer to as LOVE (LanczOs Variance Estimates), substantially improves time and space complexity. In our experiments, LOVE computes covariances up to 2,000 times faster and draws samples 18,000 times faster than existing methods, all without sacrificing accuracy.}
}
Endnote
%0 Conference Paper
%T Constant-Time Predictive Distributions for Gaussian Processes
%A Geoff Pleiss
%A Jacob Gardner
%A Kilian Weinberger
%A Andrew Gordon Wilson
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-pleiss18a
%I PMLR
%P 4114--4123
%U https://proceedings.mlr.press/v80/pleiss18a.html
%V 80
%X One of the most compelling features of Gaussian process (GP) regression is its ability to provide well-calibrated posterior distributions. Recent advances in inducing point methods have sped up GP marginal likelihood and posterior mean computations, leaving posterior covariance estimation and sampling as the remaining computational bottlenecks. In this paper we address these shortcomings by using the Lanczos algorithm to rapidly approximate the predictive covariance matrix. Our approach, which we refer to as LOVE (LanczOs Variance Estimates), substantially improves time and space complexity. In our experiments, LOVE computes covariances up to 2,000 times faster and draws samples 18,000 times faster than existing methods, all without sacrificing accuracy.
APA
Pleiss, G., Gardner, J., Weinberger, K. & Wilson, A.G. (2018). Constant-Time Predictive Distributions for Gaussian Processes. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4114-4123. Available from https://proceedings.mlr.press/v80/pleiss18a.html.
