Distributed Gaussian Processes

Marc Deisenroth, Jun Wei Ng
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1481-1490, 2015.

Abstract

To scale Gaussian processes (GPs) to large data sets we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.
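
For intuition, here is a minimal sketch (not the authors' code) of the recombination step the abstract describes: train independent GP experts on disjoint shards of the data, then merge their predictions with the robust BCM rule, which weights each expert by the difference in differential entropy between the GP prior and the expert's posterior. The scikit-learn experts, the toy data, and the helper name rbcm_predict are illustrative assumptions.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def rbcm_predict(experts, kernel, X_star):
        """Recombine independent GP experts with the rBCM rule.

        Precision: sum_k beta_k * sigma_k^-2 + (1 - sum_k beta_k) * sigma_prior^-2
        Mean:      sigma_rbcm^2 * sum_k beta_k * sigma_k^-2 * mu_k
        beta_k:    0.5 * (log sigma_prior^2 - log sigma_k^2)
        """
        prior_var = kernel.diag(X_star)            # prior variance k(x*, x*)
        precision = np.zeros(len(X_star))
        weighted_mean = np.zeros(len(X_star))
        beta_sum = np.zeros(len(X_star))
        for gp in experts:                         # each iteration is independent
            mu_k, std_k = gp.predict(X_star, return_std=True)
            var_k = std_k ** 2
            beta = 0.5 * (np.log(prior_var) - np.log(var_k))
            precision += beta / var_k
            weighted_mean += beta * mu_k / var_k
            beta_sum += beta
        precision += (1.0 - beta_sum) / prior_var  # BCM-style prior correction
        var = 1.0 / precision
        return var * weighted_mean, var

    # Toy usage: 10,000 points sharded across 10 experts sharing hyperparameters.
    rng = np.random.default_rng(0)
    X = rng.uniform(-5.0, 5.0, size=(10_000, 1))
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(10_000)

    kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
    shards = np.array_split(rng.permutation(10_000), 10)
    experts = [
        GaussianProcessRegressor(kernel=kernel, alpha=1e-2, optimizer=None).fit(X[i], y[i])
        for i in shards
    ]
    mu, var = rbcm_predict(experts, kernel, np.linspace(-5.0, 5.0, 50)[:, None])

Because each expert is fitted and queried independently and the recombination reduces to sums of precisions, the per-expert work can be mapped across cores or machines and merged afterwards, which is what makes the scheme agnostic to the underlying computational graph.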

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-deisenroth15,
  title     = {Distributed Gaussian Processes},
  author    = {Deisenroth, Marc and Ng, Jun Wei},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1481--1490},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/deisenroth15.pdf},
  url       = {https://proceedings.mlr.press/v37/deisenroth15.html},
  abstract  = {To scale Gaussian processes (GPs) to large data sets we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.}
}
Endnote
%0 Conference Paper
%T Distributed Gaussian Processes
%A Marc Deisenroth
%A Jun Wei Ng
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-deisenroth15
%I PMLR
%P 1481--1490
%U https://proceedings.mlr.press/v37/deisenroth15.html
%V 37
%X To scale Gaussian processes (GPs) to large data sets we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.
RIS
TY  - CPAPER
TI  - Distributed Gaussian Processes
AU  - Marc Deisenroth
AU  - Jun Wei Ng
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-deisenroth15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 1481
EP  - 1490
L1  - http://proceedings.mlr.press/v37/deisenroth15.pdf
UR  - https://proceedings.mlr.press/v37/deisenroth15.html
AB  - To scale Gaussian processes (GPs) to large data sets we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.
ER  -
APA
Deisenroth, M. & Ng, J.W. (2015). Distributed Gaussian Processes. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1481-1490. Available from https://proceedings.mlr.press/v37/deisenroth15.html.