Product Kernel Interpolation for Scalable Gaussian Processes

Jacob Gardner, Geoff Pleiss, Ruihan Wu, Kilian Weinberger, Andrew Wilson
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1407-1416, 2018.

Abstract

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM-based learning that exploits product kernel structure. We demonstrate that this technique is broadly applicable, resulting in linear rather than exponential runtime with dimension for SKI, as well as state-of-the-art asymptotic complexity for multi-task GPs.
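
To make the abstract's central idea concrete, below is a minimal sketch (not the paper's SKIP algorithm) of MVM-based GP inference on a product kernel: the kernel matrix is an elementwise (Hadamard) product of per-dimension RBF kernels, and the linear system for the predictive mean is solved by conjugate gradients, which touches the kernel only through matrix-vector products. The data, lengthscale, and noise values are illustrative; the dense MVM here is exactly what the paper replaces with a structured approximation whose cost grows linearly in dimension.

    import numpy as np

    # Illustrative toy data (names and values are assumptions, not from the paper).
    rng = np.random.default_rng(0)
    n, d = 500, 3
    X = rng.standard_normal((n, d))
    y = np.sin(X).sum(axis=1) + 0.1 * rng.standard_normal(n)

    def rbf_1d(xi, lengthscale=1.0):
        """Dense 1-D RBF kernel matrix for a single input dimension."""
        diff = xi[:, None] - xi[None, :]
        return np.exp(-0.5 * (diff / lengthscale) ** 2)

    # Product kernel: elementwise (Hadamard) product of per-dimension kernels.
    K = np.ones((n, n))
    for j in range(d):
        K *= rbf_1d(X[:, j])

    noise = 0.01

    def mvm(v):
        """The only operation iterative GP inference needs: v -> (K + noise*I) v.
        SKI-style methods replace this O(n^2) dense product with a fast
        structured approximation; the CG solver below is unchanged."""
        return K @ v + noise * v

    def conjugate_gradients(mvm, b, tol=1e-8, max_iter=1000):
        """Solve (K + noise*I) x = b using only matrix-vector multiplications."""
        x = np.zeros_like(b)
        r = b - mvm(x)
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Ap = mvm(p)
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    # Predictive-mean weights: (K + noise*I)^{-1} y, computed without factorizing K.
    alpha = conjugate_gradients(mvm, y)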

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-gardner18a,
  title     = {Product Kernel Interpolation for Scalable Gaussian Processes},
  author    = {Gardner, Jacob and Pleiss, Geoff and Wu, Ruihan and Weinberger, Kilian and Wilson, Andrew},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1407--1416},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/gardner18a/gardner18a.pdf},
  url       = {https://proceedings.mlr.press/v84/gardner18a.html},
  abstract  = {Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM-based learning that exploits product kernel structure. We demonstrate that this technique is broadly applicable, resulting in linear rather than exponential runtime with dimension for SKI, as well as state-of-the-art asymptotic complexity for multi-task GPs.}
}
Endnote
%0 Conference Paper
%T Product Kernel Interpolation for Scalable Gaussian Processes
%A Jacob Gardner
%A Geoff Pleiss
%A Ruihan Wu
%A Kilian Weinberger
%A Andrew Wilson
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-gardner18a
%I PMLR
%P 1407--1416
%U https://proceedings.mlr.press/v84/gardner18a.html
%V 84
%X Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM-based learning that exploits product kernel structure. We demonstrate that this technique is broadly applicable, resulting in linear rather than exponential runtime with dimension for SKI, as well as state-of-the-art asymptotic complexity for multi-task GPs.
APA
Gardner, J., Pleiss, G., Wu, R., Weinberger, K. & Wilson, A. (2018). Product Kernel Interpolation for Scalable Gaussian Processes. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1407-1416. Available from https://proceedings.mlr.press/v84/gardner18a.html.