Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1416-1425, 2018.
Abstract
We introduce a kernel approximation strategy that enables computation of the Gaussian process log marginal likelihood and all hyperparameter derivatives in O(p) time. Our GRIEF kernel consists of p eigenfunctions found using a Nyström approximation from a dense Cartesian product grid of inducing points. By exploiting algebraic properties of Kronecker and Khatri-Rao tensor products, computational complexity of the training procedure can be practically independent of the number of inducing points. This allows us to use arbitrarily many inducing points to achieve a globally accurate kernel approximation, even in high-dimensional problems. The fast likelihood evaluation enables type-I or II Bayesian inference on large-scale datasets. We benchmark our algorithms on real-world problems with up to two million training points and 10^33 inducing points.
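The abstract's claim that complexity can be "practically independent of the number of inducing points" rests on a standard algebraic fact: a kernel matrix on a Cartesian product grid is a Kronecker product of small per-dimension Gram matrices, so its eigendecomposition factorizes into per-dimension eigendecompositions. A minimal NumPy sketch of that fact (the kernel choice and grid sizes here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def rbf(x, lengthscale=1.0):
    # Squared-exponential Gram matrix on a 1-D set of points.
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
# Three 1-D grids whose Cartesian product has 4*5*3 = 60 points.
grids = [rng.uniform(0, 1, m) for m in (4, 5, 3)]
Ks = [rbf(g) for g in grids]  # small per-dimension Gram matrices

# Eigendecompose each small factor: O(sum m_i^3) instead of O((prod m_i)^3).
eigvals = [np.linalg.eigvalsh(K) for K in Ks]

# Eigenvalues of the Kronecker-product kernel are all products of the
# per-dimension eigenvalues.
lam_kron = np.array([1.0])
for w in eigvals:
    lam_kron = np.kron(lam_kron, w)

# Sanity check against the dense 60x60 kernel built explicitly.
K_full = Ks[0]
for K in Ks[1:]:
    K_full = np.kron(K_full, K)
lam_full = np.linalg.eigvalsh(K_full)

assert np.allclose(np.sort(lam_kron), np.sort(lam_full))
```

The same factorization applies to eigenvectors (Kronecker products of the per-dimension eigenvectors), which is what makes Nyström eigenfunctions on grids of e.g. 10^33 inducing points tractable: one never forms the full grid kernel.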