Fast Near-GRID Gaussian Process Regression


Yuancheng Luo, Ramani Duraiswami;
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, PMLR 31:424-432, 2013.

Abstract

Gaussian process regression (GPR) is a powerful non-linear technique for Bayesian inference and prediction. One drawback is its O(N^3) computational complexity for both prediction and hyperparameter estimation for N input points, which has led to much work in sparse GPR methods. When the covariance function is expressible as a tensor product kernel (TPK) and the inputs form a multidimensional grid, it was shown that the costs for exact GPR can be reduced to a sub-quadratic function of N. We extend these exact fast algorithms to sparse GPR and remark on a connection to Gaussian process latent variable models (GPLVMs). In practice, the inputs may also violate the multidimensional grid constraints, so we pose and efficiently solve missing- and extra-data problems for both exact and sparse grid GPR. We demonstrate our method on synthetic, text scan, and magnetic resonance imaging (MRI) data reconstructions.
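The grid speedup rests on the fact that a tensor product kernel evaluated on a Cartesian grid yields a Kronecker-structured covariance matrix, so the large linear solve decomposes into small per-axis eigendecompositions. Below is a minimal NumPy sketch of this idea (not the authors' implementation; the RBF kernel, lengthscales, noise level, and grid data are illustrative assumptions):

```python
import numpy as np

def rbf(x, z, ell):
    # Squared-exponential kernel matrix between 1-D point sets
    d = x[:, None] - z[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# 1-D grid factors (illustrative data); full grid has N = 30 * 40 points
rng = np.random.default_rng(0)
x1 = np.linspace(0, 1, 30)
x2 = np.linspace(0, 1, 40)
K1 = rbf(x1, x1, 0.2)
K2 = rbf(x2, x2, 0.2)
sigma2 = 1e-2  # observation noise variance

# Noisy observations on the full grid, flattened in row-major order
y = np.sin(4 * x1)[:, None] * np.cos(3 * x2)[None, :]
y = (y + 0.1 * rng.standard_normal(y.shape)).ravel()

# Eigendecompose each small factor; the eigenvalues of K1 kron K2
# are kron(l1, l2), and its eigenvectors are Q1 kron Q2
l1, Q1 = np.linalg.eigh(K1)
l2, Q2 = np.linalg.eigh(K2)

def kron_mv(A, B, v):
    # Compute (A kron B) v without forming the Kronecker product
    V = v.reshape(A.shape[1], B.shape[1])
    return (A @ V @ B.T).ravel()

# alpha = (K + sigma2 * I)^{-1} y using only the factor eigensystems
t = kron_mv(Q1.T, Q2.T, y)
t /= np.kron(l1, l2) + sigma2
alpha = kron_mv(Q1, Q2, t)

# Sanity check against the O(N^3) dense solve
K = np.kron(K1, K2)
alpha_dense = np.linalg.solve(K + sigma2 * np.eye(K.shape[0]), y)
print(np.allclose(alpha, alpha_dense, atol=1e-6))
```

The per-axis eigendecompositions cost O(n1^3 + n2^3) instead of O((n1*n2)^3), which is the source of the sub-quadratic scaling in N that the paper extends to the near-grid (missing/extra data) setting.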
