A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models

Trong Nghia Hoang, Quang Minh Hoang, Bryan Kian Hsiang Low ;
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:382-391, 2016.

Abstract

This paper presents a novel distributed variational inference framework that unifies many parallel sparse Gaussian process regression (SGPR) models for scalable hyperparameter learning with big data. To achieve this, our framework exploits a correlated noise process model that represents the observation noises as a finite realization of a high-order Gaussian Markov random process. Varying the Markov order and the covariance function of this noise process yields different variational SGPR models. This in turn allows us to characterize, for each variational SGPR model, the correlation structure of the noise process under which that model is optimal. We empirically evaluate the predictive performance and scalability of the distributed variational SGPR models unified by our framework on two real-world datasets.
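To make the setting concrete, the following is a minimal sketch of the standard variational sparse GP regression predictive mean with inducing inputs (in the style of Titsias's variational approximation), assuming i.i.d. Gaussian noise. It is an illustrative baseline only, not the paper's unified framework: the paper generalizes the noise term to a correlated, high-order Gaussian Markov noise process, which is not reproduced here. All function and variable names are our own.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between row sets A (n,d) and B (m,d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sgpr_predict(X, y, Z, Xs, noise=0.1):
    """Variational sparse GP predictive mean at test inputs Xs.

    X:  (n,d) training inputs;  y: (n,) targets
    Z:  (m,d) inducing inputs;  Xs: (s,d) test inputs
    noise: i.i.d. Gaussian noise variance (the paper replaces this
    with a correlated Markov noise process).
    """
    m = len(Z)
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(m)   # jitter for numerical stability
    Kmn = rbf(Z, X)
    Kms = rbf(Z, Xs)
    # Predictive mean: Kms^T (Kmm + Kmn Knm / noise)^{-1} Kmn y / noise
    Sigma = Kmm + Kmn @ Kmn.T / noise
    alpha = np.linalg.solve(Sigma, Kmn @ y) / noise
    return Kms.T @ alpha
```

With enough well-placed inducing inputs, this approximation closely tracks the full GP posterior mean at a fraction of the O(n^3) cost; the paper's contribution is to distribute and unify such approximations under richer noise correlation structures.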
