Chained Gaussian Processes

Alan D. Saul, James Hensman, Aki Vehtari, Neil D. Lawrence
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1431-1440, 2016.

Abstract

Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models "Chained Gaussian Processes": the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.
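The abstract's idea of tying several latent GPs to the parameters of a single likelihood is perhaps easiest to see in heteroscedastic Gaussian regression, where one process models the mean and another the log-variance. Below is a minimal generative sketch of that model structure in plain NumPy; it does not use the paper's approximate inference procedure, and the RBF kernel, its hyperparameters, and the specific heteroscedastic Gaussian likelihood are illustrative assumptions rather than the paper's exact setup.

import numpy as np

def rbf_kernel(X1, X2, variance=1.0, lengthscale=0.3):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 200)

# Two independent GP priors: f models the mean, g models the log-variance.
K = rbf_kernel(X, X) + 1e-8 * np.eye(len(X))  # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(len(X)), K)
g = rng.multivariate_normal(np.zeros(len(X)), K)

# "Chained" combination: each likelihood parameter depends on its own latent
# process, here y_i ~ N(f(x_i), exp(g(x_i))), so the noise level varies with x.
y = f + np.exp(0.5 * g) * rng.standard_normal(len(X))

The point of the sketch is that the map from the pair (f, g) to the distribution of y is not a single invertible link applied to one linear combination of processes, which is exactly the situation the chained formulation is designed to handle.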

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-saul16,
  title     = {Chained Gaussian Processes},
  author    = {Saul, Alan D. and Hensman, James and Vehtari, Aki and Lawrence, Neil D.},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1431--1440},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/saul16.pdf},
  url       = {https://proceedings.mlr.press/v51/saul16.html},
  abstract  = {Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models "Chained Gaussian Processes": the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.}
}
Endnote
%0 Conference Paper
%T Chained Gaussian Processes
%A Alan D. Saul
%A James Hensman
%A Aki Vehtari
%A Neil D. Lawrence
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-saul16
%I PMLR
%P 1431--1440
%U https://proceedings.mlr.press/v51/saul16.html
%V 51
%X Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models "Chained Gaussian Processes": the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.
RIS
TY - CPAPER
TI - Chained Gaussian Processes
AU - Alan D. Saul
AU - James Hensman
AU - Aki Vehtari
AU - Neil D. Lawrence
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-saul16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 1431
EP - 1440
L1 - http://proceedings.mlr.press/v51/saul16.pdf
UR - https://proceedings.mlr.press/v51/saul16.html
AB - Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models "Chained Gaussian Processes": the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.
ER -
APA
Saul, A.D., Hensman, J., Vehtari, A. & Lawrence, N.D. (2016). Chained Gaussian Processes. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1431-1440. Available from https://proceedings.mlr.press/v51/saul16.html.