Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition

Shengyang Sun, Jiaxin Shi, Andrew Gordon Wilson, Roger B. Grosse
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:9955-9965, 2021.

Abstract

We introduce a new scalable variational Gaussian process approximation which provides a high-fidelity approximation while retaining general applicability. We propose the harmonic kernel decomposition (HKD), which uses Fourier series to decompose a kernel as a sum of orthogonal kernels. Our variational approximation exploits this orthogonality to enable a large number of inducing points at a low computational cost. We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections, and it significantly outperforms standard variational methods in scalability and accuracy. Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
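To make the decomposition concrete, the following is a minimal sketch of the simplest case of an HKD: an order-2 decomposition of a reflection-invariant kernel into even and odd orthogonal components. The names rbf and hkd_reflection are illustrative rather than from the paper; the paper's general construction uses complex Fourier weights over a cyclic group of J isometries, of which this is the J = 2 instance.

import numpy as np

def rbf(x, y, lengthscale=1.0):
    # Squared-exponential kernel on 1-D inputs; it satisfies
    # k(-x, -y) = k(x, y), i.e. it is invariant under the reflection group.
    return np.exp(-0.5 * np.subtract.outer(x, y) ** 2 / lengthscale ** 2)

def hkd_reflection(k, x, y):
    # Harmonic decomposition under the order-2 group {I, R} with Rx = -x:
    # k = k0 + k1, where k0 (even) and k1 (odd) are orthogonal kernels.
    k0 = 0.5 * (k(x, y) + k(x, -y))  # reflection-symmetric component
    k1 = 0.5 * (k(x, y) - k(x, -y))  # reflection-antisymmetric component
    return k0, k1

x = np.linspace(-2.0, 2.0, 5)
k0, k1 = hkd_reflection(rbf, x, x)
assert np.allclose(k0 + k1, rbf(x, x))  # components sum back to the kernel

Each orthogonal component can then carry its own set of inducing points, and per the abstract it is this orthogonality that lets the variational approximation accommodate a large total number of inducing points at low computational cost.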

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-sun21d,
  title     = {Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition},
  author    = {Sun, Shengyang and Shi, Jiaxin and Wilson, Andrew Gordon and Grosse, Roger B.},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {9955--9965},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/sun21d/sun21d.pdf},
  url       = {https://proceedings.mlr.press/v139/sun21d.html},
  abstract  = {We introduce a new scalable variational Gaussian process approximation which provides a high-fidelity approximation while retaining general applicability. We propose the harmonic kernel decomposition (HKD), which uses Fourier series to decompose a kernel as a sum of orthogonal kernels. Our variational approximation exploits this orthogonality to enable a large number of inducing points at a low computational cost. We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections, and it significantly outperforms standard variational methods in scalability and accuracy. Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.}
}
Endnote
%0 Conference Paper
%T Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition
%A Shengyang Sun
%A Jiaxin Shi
%A Andrew Gordon Wilson
%A Roger B. Grosse
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-sun21d
%I PMLR
%P 9955--9965
%U https://proceedings.mlr.press/v139/sun21d.html
%V 139
%X We introduce a new scalable variational Gaussian process approximation which provides a high-fidelity approximation while retaining general applicability. We propose the harmonic kernel decomposition (HKD), which uses Fourier series to decompose a kernel as a sum of orthogonal kernels. Our variational approximation exploits this orthogonality to enable a large number of inducing points at a low computational cost. We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections, and it significantly outperforms standard variational methods in scalability and accuracy. Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
APA
Sun, S., Shi, J., Wilson, A.G. & Grosse, R.B. (2021). Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:9955-9965. Available from https://proceedings.mlr.press/v139/sun21d.html.
