Variational Gaussian Process Models without Matrix Inverses

Mark van der Wilk, ST John, Artem Artemev, James Hensman
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-9, 2020.

Abstract

In this work, we provide a variational lower bound that can be computed without expensive matrix operations like inversion. Our bound can be used as a drop-in replacement to the existing variational method of Hensman et al. (2013, 2015), and can therefore directly be applied in a wide variety of models, such as deep GPs (Damianou and Lawrence, 2013). We focus on the theoretical properties of this new bound, and show some initial experimental results for optimising this bound. We hope to realise the full promise in scalability that this new bound has in future work.
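For context, the sparse variational bound of Hensman et al. (2013) that this work is designed to replace has the following standard form (notation assumed here: M inducing points, variational posterior q(u) = N(m, S), inducing-point covariance K_uu):

\mathcal{L} = \sum_{n=1}^{N} \mathbb{E}_{q(f_n)}\!\left[\log p(y_n \mid f_n)\right] - \mathrm{KL}\!\left[q(\mathbf{u}) \,\|\, p(\mathbf{u})\right],
\qquad
q(f_n) = \mathcal{N}\!\left(f_n \,\middle|\, \mathbf{k}_n^\top K_{uu}^{-1}\mathbf{m},\; k_{nn} - \mathbf{k}_n^\top K_{uu}^{-1}\left(K_{uu} - S\right)K_{uu}^{-1}\mathbf{k}_n\right).

Evaluating this bound requires the inverse (or Cholesky factor) of the M-by-M matrix K_uu, at O(M^3) cost per optimisation step; the bound proposed in the paper is a drop-in replacement that avoids such matrix inversions.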

Cite this Paper


BibTeX
@InProceedings{pmlr-v118-wilk20a,
  title     = {Variational Gaussian Process Models without Matrix Inverses},
  author    = {van der Wilk, Mark and John, ST and Artemev, Artem and Hensman, James},
  booktitle = {Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference},
  pages     = {1--9},
  year      = {2020},
  editor    = {Zhang, Cheng and Ruiz, Francisco and Bui, Thang and Dieng, Adji Bousso and Liang, Dawen},
  volume    = {118},
  series    = {Proceedings of Machine Learning Research},
  month     = {08 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v118/wilk20a/wilk20a.pdf},
  url       = {https://proceedings.mlr.press/v118/wilk20a.html},
  abstract  = {In this work, we provide a variational lower bound that can be computed without expensive matrix operations like inversion. Our bound can be used as a drop-in replacement to the existing variational method of Hensman et al. (2013, 2015), and can therefore directly be applied in a wide variety of models, such as deep GPs (Damianou and Lawrence, 2013). We focus on the theoretical properties of this new bound, and show some initial experimental results for optimising this bound. We hope to realise the full promise in scalability that this new bound has in future work.}
}
Endnote
%0 Conference Paper
%T Variational Gaussian Process Models without Matrix Inverses
%A Mark van der Wilk
%A ST John
%A Artem Artemev
%A James Hensman
%B Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2020
%E Cheng Zhang
%E Francisco Ruiz
%E Thang Bui
%E Adji Bousso Dieng
%E Dawen Liang
%F pmlr-v118-wilk20a
%I PMLR
%P 1--9
%U https://proceedings.mlr.press/v118/wilk20a.html
%V 118
%X In this work, we provide a variational lower bound that can be computed without expensive matrix operations like inversion. Our bound can be used as a drop-in replacement to the existing variational method of Hensman et al. (2013, 2015), and can therefore directly be applied in a wide variety of models, such as deep GPs (Damianou and Lawrence, 2013). We focus on the theoretical properties of this new bound, and show some initial experimental results for optimising this bound. We hope to realise the full promise in scalability that this new bound has in future work.
APA
van der Wilk, M., John, S., Artemev, A. & Hensman, J. (2020). Variational Gaussian Process Models without Matrix Inverses. Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 118:1-9. Available from https://proceedings.mlr.press/v118/wilk20a.html.