Doubly Sparse Variational Gaussian Processes

Vincent Adam, Stefanos Eleftheriadis, Artem Artemev, Nicolas Durrande, James Hensman
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2874-2884, 2020.

Abstract

The use of Gaussian process models is typically limited to datasets with a few tens of thousands of observations due to their complexity and memory footprint. The two most commonly used methods to overcome this limitation are 1) the variational sparse approximation which relies on inducing points and 2) the state-space equivalent formulation of Gaussian processes which can be seen as exploiting some sparsity in the precision matrix. In this work, we propose to take the best of both worlds: we show that the inducing point framework is still valid for state-space models and that it can bring further computational and memory savings. Furthermore, we provide the natural gradient formulation for the proposed variational parameterisation. Finally, this work makes it possible to use the state-space formulation inside deep Gaussian process models as illustrated in one of the experiments.
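For context on the first ingredient mentioned above, here is a minimal NumPy sketch of the standard inducing-point (sparse variational) predictive equations of Titsias-style GP approximations. This is an illustration of the generic framework only, not the paper's doubly sparse scheme; the kernel, inducing locations, and variational parameters below are arbitrary placeholder values.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# M inducing inputs Z summarise the data; q(u) = N(m, S) is the
# variational distribution over the inducing variables u = f(Z).
M = 5
Z = np.linspace(0.0, 10.0, M)        # inducing inputs (placeholder)
Xs = np.linspace(0.0, 10.0, 50)      # test inputs
m = np.zeros(M)                      # variational mean (placeholder)
S = 0.1 * np.eye(M)                  # variational covariance (placeholder)

Kuu = rbf(Z, Z) + 1e-6 * np.eye(M)   # jitter for numerical stability
Ksu = rbf(Xs, Z)
Kss = rbf(Xs, Xs)

# Predictive posterior q(f*) = N(A m, Kss - A (Kuu - S) A^T),
# where A = K_{*u} K_{uu}^{-1}.
A = Ksu @ np.linalg.inv(Kuu)
mean = A @ m
cov = Kss - A @ (Kuu - S) @ A.T
```

The cost of these equations is dominated by the M x M inverse, which is what makes inducing-point methods scale to large datasets; the paper's contribution is to combine this with the state-space view, which additionally exploits sparsity in the precision matrix.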

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-adam20a,
  title     = {Doubly Sparse Variational Gaussian Processes},
  author    = {Adam, Vincent and Eleftheriadis, Stefanos and Artemev, Artem and Durrande, Nicolas and Hensman, James},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2874--2884},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/adam20a/adam20a.pdf},
  url       = {http://proceedings.mlr.press/v108/adam20a.html},
  abstract  = {The use of Gaussian process models is typically limited to datasets with a few tens of thousands of observations due to their complexity and memory footprint. The two most commonly used methods to overcome this limitation are 1) the variational sparse approximation which relies on inducing points and 2) the state-space equivalent formulation of Gaussian processes which can be seen as exploiting some sparsity in the precision matrix. In this work, we propose to take the best of both worlds: we show that the inducing point framework is still valid for state-space models and that it can bring further computational and memory savings. Furthermore, we provide the natural gradient formulation for the proposed variational parameterisation. Finally, this work makes it possible to use the state-space formulation inside deep Gaussian process models as illustrated in one of the experiments.}
}
Endnote
%0 Conference Paper
%T Doubly Sparse Variational Gaussian Processes
%A Vincent Adam
%A Stefanos Eleftheriadis
%A Artem Artemev
%A Nicolas Durrande
%A James Hensman
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-adam20a
%I PMLR
%P 2874--2884
%U http://proceedings.mlr.press/v108/adam20a.html
%V 108
%X The use of Gaussian process models is typically limited to datasets with a few tens of thousands of observations due to their complexity and memory footprint. The two most commonly used methods to overcome this limitation are 1) the variational sparse approximation which relies on inducing points and 2) the state-space equivalent formulation of Gaussian processes which can be seen as exploiting some sparsity in the precision matrix. In this work, we propose to take the best of both worlds: we show that the inducing point framework is still valid for state-space models and that it can bring further computational and memory savings. Furthermore, we provide the natural gradient formulation for the proposed variational parameterisation. Finally, this work makes it possible to use the state-space formulation inside deep Gaussian process models as illustrated in one of the experiments.
APA
Adam, V., Eleftheriadis, S., Artemev, A., Durrande, N. & Hensman, J. (2020). Doubly Sparse Variational Gaussian Processes. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2874-2884. Available from http://proceedings.mlr.press/v108/adam20a.html.