Scalable High-Order Gaussian Process Regression
Proceedings of Machine Learning Research, PMLR 89:2611-2620, 2019.
Abstract
While most Gaussian process (GP) research focuses on learning single-output functions, many applications, such as physical simulation and gene expression prediction, require estimating functions with many outputs, where the number of outputs can be comparable to or much larger than the number of training samples. Existing multi-output GP models are either limited to low-dimensional outputs and restricted kernel choices, or assume oversimplified low-rank structures within the outputs. To address these issues, we propose HOGPR, a High-Order Gaussian Process Regression model, which can flexibly capture complex correlations among the outputs and scale up to a large number of outputs. Specifically, we tensorize the high-dimensional outputs, introducing latent coordinate features to index each tensor element (i.e., each output) and to capture their correlations. We then generalize a multilinear model to a hybrid of a GP and a latent GP model. The model is endowed with a Kronecker-product structure over the inputs and the latent features. Using Kronecker-product properties and tensor algebra, we are able to perform exact inference over millions of outputs. We demonstrate the advantages of the proposed model on several real-world applications.
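The scalability claim in the abstract rests on a standard Kronecker identity: when the joint covariance factorizes as K_x ⊗ K_l (inputs times latent features), a linear solve against the full covariance reduces to eigendecompositions of the two small factor matrices, costing O(n³ + d³) rather than O((nd)³). Below is a minimal NumPy sketch of that trick; the RBF kernel, function names, and sizes are illustrative assumptions, not code from the paper:

```python
import numpy as np

def rbf(X, lengthscale=1.0):
    # Illustrative squared-exponential kernel (not the paper's kernel choice).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def kron_gp_solve(Kx, Kl, y, noise=1e-2):
    """Solve (Kx kron Kl + noise*I) alpha = y without forming the
    n*d x n*d Kronecker product, via per-factor eigendecompositions."""
    lx, Qx = np.linalg.eigh(Kx)   # Kx = Qx diag(lx) Qx^T
    ll, Ql = np.linalg.eigh(Kl)   # Kl = Ql diag(ll) Ql^T
    n, d = Kx.shape[0], Kl.shape[0]
    Y = y.reshape(n, d)
    # Rotate into the joint eigenbasis: (Qx kron Ql)^T y = vec(Qx^T Y Ql)
    Yt = Qx.T @ Y @ Ql
    # Eigenvalues of Kx kron Kl are all products lx[i] * ll[j]
    Yt = Yt / (np.outer(lx, ll) + noise)
    # Rotate back and flatten to a vector
    return (Qx @ Yt @ Ql.T).ravel()

rng = np.random.default_rng(0)
Kx = rbf(rng.normal(size=(5, 2)))   # covariance over 5 inputs
Kl = rbf(rng.normal(size=(4, 1)))   # covariance over 4 latent output features
y = rng.normal(size=20)             # 5 x 4 = 20 stacked outputs
alpha = kron_gp_solve(Kx, Kl, y, noise=0.1)
```

The result agrees with a dense solve against `np.kron(Kx, Kl) + 0.1 * np.eye(20)`, but the dense matrix is never built; the paper's exact inference over millions of outputs depends on exploiting this structure throughout.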