A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models

Trong Nghia Hoang, Quang Minh Hoang, Bryan Kian Hsiang Low
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:382-391, 2016.

Abstract

This paper presents a novel distributed variational inference framework that unifies many parallel sparse Gaussian process regression (SGPR) models for scalable hyperparameter learning with big data. To achieve this, our framework exploits the structure of a correlated noise process model that represents the observation noises as a finite realization of a high-order Gaussian Markov random process. By varying the Markov order and the covariance function of the noise process model, different variational SGPR models result. This in turn allows us to characterize the correlation structure of the noise process model for which a particular variational SGPR model is optimal. We empirically evaluate the predictive performance and scalability of the distributed variational SGPR models unified by our framework on two real-world datasets.
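
To make the unifying idea concrete, below is a minimal, illustrative Python/NumPy sketch, not the authors' algorithm or code. It uses the classical unifying view of sparse GP regression (Quiñonero-Candela & Rasmussen, 2005), in which the training outputs are modeled as y ~ N(0, Q_ff + S), where Q_ff = K_fu K_uu^{-1} K_uf is the inducing-point (Nystrom) approximation and S is a structured noise covariance. Choosing S to be white, diagonal, or block-diagonal recovers DTC-, FITC-, and PITC-style models, which loosely mirrors how this paper varies the Markov order and covariance function of its correlated noise process. All function names and toy data below (rbf, sgpr_log_marginal) are hypothetical.

    # Illustrative sketch only: one routine scores several members of the
    # unified SGPR family by swapping the structure of the noise covariance S.
    import numpy as np

    def rbf(X, Z, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between rows of X and Z.
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    def sgpr_log_marginal(y, X, Xu, noise_var=0.1, model="DTC", blocks=None):
        # Evaluates log N(y | 0, Q_ff + S) for classical choices of S:
        #   "DTC":  S = noise_var * I                         (white noise)
        #   "FITC": S = diag(K_ff - Q_ff) + noise_var * I     (independent noise)
        #   "PITC": S = blkdiag(K_ff - Q_ff) + noise_var * I  (block-correlated)
        Kuu = rbf(Xu, Xu) + 1e-8 * np.eye(len(Xu))   # jitter for stability
        Kfu = rbf(X, Xu)
        Qff = Kfu @ np.linalg.solve(Kuu, Kfu.T)      # Nystrom approximation
        n = len(y)
        S = noise_var * np.eye(n)
        if model in ("FITC", "PITC"):
            R = rbf(X, X) - Qff                      # residual covariance K_ff - Q_ff
            if model == "FITC":
                S += np.diag(np.diag(R))
            else:
                for b in blocks:                     # keep within-block correlations
                    S[np.ix_(b, b)] += R[np.ix_(b, b)]
        C = Qff + S
        _, logdet = np.linalg.slogdet(C)
        return -0.5 * (y @ np.linalg.solve(C, y) + logdet + n * np.log(2 * np.pi))

    # Toy usage: the same routine scores three members of the unified family.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(40, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
    Xu = np.linspace(-3.0, 3.0, 8)[:, None]          # inducing inputs
    blocks = [list(range(i, i + 10)) for i in range(0, 40, 10)]
    for m in ("DTC", "FITC", "PITC"):
        print(m, sgpr_log_marginal(y, X, Xu, model=m, blocks=blocks))

Note that the dense O(n^3) solves above are purely expository; avoiding such costs through distributed variational inference is precisely the point of the framework proposed in the paper.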

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-hoang16,
  title     = {A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models},
  author    = {Hoang, Trong Nghia and Hoang, Quang Minh and Low, Bryan Kian Hsiang},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {382--391},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/hoang16.pdf},
  url       = {https://proceedings.mlr.press/v48/hoang16.html}
}
EndNote
%0 Conference Paper
%T A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models
%A Trong Nghia Hoang
%A Quang Minh Hoang
%A Bryan Kian Hsiang Low
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-hoang16
%I PMLR
%P 382--391
%U https://proceedings.mlr.press/v48/hoang16.html
%V 48
RIS
TY  - CPAPER
TI  - A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models
AU  - Trong Nghia Hoang
AU  - Quang Minh Hoang
AU  - Bryan Kian Hsiang Low
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-hoang16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 382
EP  - 391
L1  - http://proceedings.mlr.press/v48/hoang16.pdf
UR  - https://proceedings.mlr.press/v48/hoang16.html
ER  -
APA
Hoang, T.N., Hoang, Q.M. & Low, B.K.H. (2016). A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:382-391. Available from https://proceedings.mlr.press/v48/hoang16.html.