Approximate Inference for Fully Bayesian Gaussian Process Regression

Vidhi Lalchand, Carl Edward Rasmussen
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-12, 2020.

Abstract

Learning in Gaussian Process models occurs through the adaptation of hyperparameters of the mean and the covariance function. The classical approach entails maximizing the marginal likelihood yielding fixed point estimates (an approach called Type II maximum likelihood or ML-II). An alternative learning procedure is to infer the posterior over hyperparameters in a hierarchical specification of GPs we call Fully Bayesian Gaussian Process Regression (GPR). This work considers two approximation schemes for the intractable hyperparameter posterior: 1) Hamiltonian Monte Carlo (HMC) yielding a sampling-based approximation and 2) Variational Inference (VI) where the posterior over hyperparameters is approximated by a factorized Gaussian (mean-field) or a full-rank Gaussian accounting for correlations between hyperparameters. We analyse the predictive performance for fully Bayesian GPR on a range of benchmark data sets.

Cite this Paper


BibTeX
@InProceedings{pmlr-v118-lalchand20a, title = { Approximate Inference for Fully Bayesian Gaussian Process Regression }, author = {Lalchand, Vidhi and Rasmussen, Carl Edward}, booktitle = {Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference}, pages = {1--12}, year = {2020}, editor = {Zhang, Cheng and Ruiz, Francisco and Bui, Thang and Dieng, Adji Bousso and Liang, Dawen}, volume = {118}, series = {Proceedings of Machine Learning Research}, month = {08 Dec}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v118/lalchand20a/lalchand20a.pdf}, url = {https://proceedings.mlr.press/v118/lalchand20a.html}, abstract = { Learning in Gaussian Process models occurs through the adaptation of hyperparameters of the mean and the covariance function. The classical approach entails maximizing the marginal likelihood yielding fixed point estimates (an approach called Type II maximum likelihood or ML-II). An alternative learning procedure is to infer the posterior over hyperparameters in a hierarchical specification of GPs we call Fully Bayesian Gaussian Process Regression (GPR). This work considers two approximation schemes for the intractable hyperparameter posterior: 1) Hamiltonian Monte Carlo (HMC) yielding a sampling-based approximation and 2) Variational Inference (VI) where the posterior over hyperparameters is approximated by a factorized Gaussian (mean-field) or a full-rank Gaussian accounting for correlations between hyperparameters. We analyse the predictive performance for fully Bayesian GPR on a range of benchmark data sets.} }
Endnote
%0 Conference Paper %T Approximate Inference for Fully Bayesian Gaussian Process Regression %A Vidhi Lalchand %A Carl Edward Rasmussen %B Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference %C Proceedings of Machine Learning Research %D 2020 %E Cheng Zhang %E Francisco Ruiz %E Thang Bui %E Adji Bousso Dieng %E Dawen Liang %F pmlr-v118-lalchand20a %I PMLR %P 1--12 %U https://proceedings.mlr.press/v118/lalchand20a.html %V 118 %X Learning in Gaussian Process models occurs through the adaptation of hyperparameters of the mean and the covariance function. The classical approach entails maximizing the marginal likelihood yielding fixed point estimates (an approach called Type II maximum likelihood or ML-II). An alternative learning procedure is to infer the posterior over hyperparameters in a hierarchical specification of GPs we call Fully Bayesian Gaussian Process Regression (GPR). This work considers two approximation schemes for the intractable hyperparameter posterior: 1) Hamiltonian Monte Carlo (HMC) yielding a sampling-based approximation and 2) Variational Inference (VI) where the posterior over hyperparameters is approximated by a factorized Gaussian (mean-field) or a full-rank Gaussian accounting for correlations between hyperparameters. We analyse the predictive performance for fully Bayesian GPR on a range of benchmark data sets.
APA
Lalchand, V. & Rasmussen, C.E. (2020). Approximate Inference for Fully Bayesian Gaussian Process Regression. Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 118:1-12. Available from https://proceedings.mlr.press/v118/lalchand20a.html.