Generalized Robust Bayesian Committee Machine for Large-scale Gaussian Process Regression

Haitao Liu, Jianfei Cai, Yi Wang, Yew Soon Ong
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3131-3140, 2018.

Abstract

In order to scale standard Gaussian process (GP) regression to large-scale datasets, aggregation models employ a factorized training process and then combine the predictions from distributed experts. The state-of-the-art aggregation models, however, either provide inconsistent predictions or require a time-consuming aggregation process. We first prove the inconsistency of typical aggregations using disjoint or random data partition, and then present a consistent yet efficient aggregation model for large-scale GP. The proposed model inherits the advantages of aggregations, e.g., closed-form inference and aggregation, parallelization and distributed computing. Furthermore, theoretical and empirical analyses reveal that the new aggregation model performs better due to the consistent predictions, which converge to the true underlying function as the training size approaches infinity.
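The aggregation step the abstract refers to combines per-expert predictive means and variances in closed form. A minimal sketch of the robust Bayesian committee machine (rBCM) rule that this paper generalizes is shown below, using the differential-entropy weights of Deisenroth & Ng (2015); the function name and array layout are illustrative, not from the paper itself.

```python
import numpy as np

def rbcm_aggregate(mu, s2, s2_prior):
    """Robust-BCM aggregation of M expert GP predictions at n test points.

    mu, s2     : (M, n) per-expert predictive means and variances.
    s2_prior   : (n,) prior variance of the GP at the test points.
    Returns the aggregated predictive mean and variance, each of shape (n,).
    """
    # Differential-entropy weight of each expert at each test point.
    beta = 0.5 * (np.log(s2_prior) - np.log(s2))
    # Aggregated precision: weighted expert precisions plus a prior correction
    # term that keeps the model robust when experts are uninformative.
    prec = (beta / s2).sum(axis=0) + (1.0 - beta.sum(axis=0)) / s2_prior
    var = 1.0 / prec
    mean = var * (beta / s2 * mu).sum(axis=0)
    return mean, var
```

Because every quantity is an elementwise array operation, the per-expert predictions can be computed in parallel and merged cheaply, which is the efficiency advantage the abstract highlights; the paper's generalized model (GRBCM) modifies this combination to make the result consistent.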

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-liu18a,
  title     = {Generalized Robust {B}ayesian Committee Machine for Large-scale {G}aussian Process Regression},
  author    = {Liu, Haitao and Cai, Jianfei and Wang, Yi and Ong, Yew Soon},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3131--3140},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/liu18a/liu18a.pdf},
  url       = {http://proceedings.mlr.press/v80/liu18a.html},
  abstract  = {In order to scale standard Gaussian process (GP) regression to large-scale datasets, aggregation models employ a factorized training process and then combine the predictions from distributed experts. The state-of-the-art aggregation models, however, either provide inconsistent predictions or require a time-consuming aggregation process. We first prove the inconsistency of typical aggregations using disjoint or random data partition, and then present a consistent yet efficient aggregation model for large-scale GP. The proposed model inherits the advantages of aggregations, e.g., closed-form inference and aggregation, parallelization and distributed computing. Furthermore, theoretical and empirical analyses reveal that the new aggregation model performs better due to the consistent predictions, which converge to the true underlying function as the training size approaches infinity.}
}
Endnote
%0 Conference Paper
%T Generalized Robust Bayesian Committee Machine for Large-scale Gaussian Process Regression
%A Haitao Liu
%A Jianfei Cai
%A Yi Wang
%A Yew Soon Ong
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-liu18a
%I PMLR
%P 3131--3140
%U http://proceedings.mlr.press/v80/liu18a.html
%V 80
%X In order to scale standard Gaussian process (GP) regression to large-scale datasets, aggregation models employ a factorized training process and then combine the predictions from distributed experts. The state-of-the-art aggregation models, however, either provide inconsistent predictions or require a time-consuming aggregation process. We first prove the inconsistency of typical aggregations using disjoint or random data partition, and then present a consistent yet efficient aggregation model for large-scale GP. The proposed model inherits the advantages of aggregations, e.g., closed-form inference and aggregation, parallelization and distributed computing. Furthermore, theoretical and empirical analyses reveal that the new aggregation model performs better due to the consistent predictions, which converge to the true underlying function as the training size approaches infinity.
APA
Liu, H., Cai, J., Wang, Y. &amp; Ong, Y.S. (2018). Generalized Robust Bayesian Committee Machine for Large-scale Gaussian Process Regression. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3131-3140. Available from http://proceedings.mlr.press/v80/liu18a.html.