Sparse Orthogonal Variational Inference for Gaussian Processes

Jiaxin Shi, Michalis Titsias, Andriy Mnih
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1932-1942, 2020.

Abstract

We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods. It is based on decomposing a Gaussian process as a sum of two independent processes: one spanned by a finite basis of inducing points and the other capturing the remaining variation. We show that this formulation recovers existing approximations and at the same time allows us to obtain tighter lower bounds on the marginal likelihood and new stochastic variational inference algorithms. We demonstrate the efficiency of these algorithms in several Gaussian process models ranging from standard regression to multi-class classification using (deep) convolutional Gaussian processes, and report state-of-the-art results on CIFAR-10 among purely GP-based models.
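The decomposition described above can be sketched in standard inducing-point notation (a sketch only; the symbols Z, u, and k are the usual inducing locations, inducing variables, and kernel, and the paper's exact variational construction may differ):

```latex
% GP split into a component spanned by the inducing basis
% and an independent "orthogonal" residual process
\[
f(\cdot) \;=\; \underbrace{k(\cdot, Z)\, K_{ZZ}^{-1}\, u}_{\text{spanned by inducing points}} \;+\; f_{\perp}(\cdot),
\]
% where the residual is a zero-mean GP with the deflated covariance
\[
k_{\perp}(x, x') \;=\; k(x, x') \;-\; k(x, Z)\, K_{ZZ}^{-1}\, k(Z, x').
\]
```

Here the first term carries the finite-basis part of the process, while \(f_{\perp}\) captures the remaining variation that the inducing points cannot represent.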

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-shi20b,
  title     = {Sparse Orthogonal Variational Inference for Gaussian Processes},
  author    = {Shi, Jiaxin and Titsias, Michalis and Mnih, Andriy},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {1932--1942},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/shi20b/shi20b.pdf},
  url       = {https://proceedings.mlr.press/v108/shi20b.html},
  abstract  = {We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods. It is based on decomposing a Gaussian process as a sum of two independent processes: one spanned by a finite basis of inducing points and the other capturing the remaining variation. We show that this formulation recovers existing approximations and at the same time allows to obtain tighter lower bounds on the marginal likelihood and new stochastic variational inference algorithms. We demonstrate the efficiency of these algorithms in several Gaussian process models ranging from standard regression to multi-class classification using (deep) convolutional Gaussian processes and report state-of-the-art results on CIFAR-10 among purely GP-based models.}
}
Endnote
%0 Conference Paper
%T Sparse Orthogonal Variational Inference for Gaussian Processes
%A Jiaxin Shi
%A Michalis Titsias
%A Andriy Mnih
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-shi20b
%I PMLR
%P 1932--1942
%U https://proceedings.mlr.press/v108/shi20b.html
%V 108
%X We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods. It is based on decomposing a Gaussian process as a sum of two independent processes: one spanned by a finite basis of inducing points and the other capturing the remaining variation. We show that this formulation recovers existing approximations and at the same time allows to obtain tighter lower bounds on the marginal likelihood and new stochastic variational inference algorithms. We demonstrate the efficiency of these algorithms in several Gaussian process models ranging from standard regression to multi-class classification using (deep) convolutional Gaussian processes and report state-of-the-art results on CIFAR-10 among purely GP-based models.
APA
Shi, J., Titsias, M. & Mnih, A. (2020). Sparse Orthogonal Variational Inference for Gaussian Processes. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:1932-1942. Available from https://proceedings.mlr.press/v108/shi20b.html.