Sequential Inference for Deep Gaussian Process

Yali Wang, Marcus Brubaker, Brahim Chaib-Draa, Raquel Urtasun
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:694-703, 2016.

Abstract

A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly-nonlinear functions for complex data sets. However, the network structure of DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGP, where the data is processed sequentially. We also propose two DGP extensions to handle heteroscedasticity and multi-task learning. Our experimental evaluation shows the effectiveness of our sequential inference framework on a number of important learning tasks.
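To make the "each layer is modelled with a GP" idea concrete, the sketch below draws a sample from a two-layer DGP prior by feeding the output of one GP layer in as the input to the next. This is only an illustration of the model structure described in the abstract, not the paper's sequential inference framework; the RBF kernel, layer widths, jitter value, and NumPy-based sampling are illustrative assumptions.

# Minimal sketch (illustrative, not the paper's implementation): sample from a
# two-layer DGP prior, where each layer is a zero-mean GP and the output of one
# layer becomes the input to the next.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: variance * exp(-||x - x'||^2 / (2 * lengthscale^2))
    sqdist = (np.sum(X1**2, axis=1)[:, None]
              + np.sum(X2**2, axis=1)[None, :]
              - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def sample_gp_layer(X, output_dim, rng, jitter=1e-5):
    # Draw one zero-mean GP function sample per output dimension at inputs X.
    K = rbf_kernel(X, X) + jitter * np.eye(X.shape[0])
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((X.shape[0], output_dim))

rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 100)[:, None]          # observed inputs
H = sample_gp_layer(X, output_dim=2, rng=rng)     # hidden layer: GP warping of the inputs
Y = sample_gp_layer(H, output_dim=1, rng=rng)     # output layer: GP on the hidden representation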

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-wang16c,
  title     = {Sequential Inference for Deep Gaussian Process},
  author    = {Wang, Yali and Brubaker, Marcus and Chaib-Draa, Brahim and Urtasun, Raquel},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {694--703},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/wang16c.pdf},
  url       = {https://proceedings.mlr.press/v51/wang16c.html},
  abstract  = {A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly-nonlinear functions for complex data sets. However, the network structure of DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGP, where the data is processed sequentially. We also propose two DGP extensions to handle heteroscedasticity and multi-task learning. Our experimental evaluation shows the effectiveness of our sequential inference framework on a number of important learning tasks.}
}
Endnote
%0 Conference Paper
%T Sequential Inference for Deep Gaussian Process
%A Yali Wang
%A Marcus Brubaker
%A Brahim Chaib-Draa
%A Raquel Urtasun
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-wang16c
%I PMLR
%P 694--703
%U https://proceedings.mlr.press/v51/wang16c.html
%V 51
%X A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly-nonlinear functions for complex data sets. However, the network structure of DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGP, where the data is processed sequentially. We also propose two DGP extensions to handle heteroscedasticity and multi-task learning. Our experimental evaluation shows the effectiveness of our sequential inference framework on a number of important learning tasks.
RIS
TY  - CPAPER
TI  - Sequential Inference for Deep Gaussian Process
AU  - Yali Wang
AU  - Marcus Brubaker
AU  - Brahim Chaib-Draa
AU  - Raquel Urtasun
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-wang16c
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 694
EP  - 703
L1  - http://proceedings.mlr.press/v51/wang16c.pdf
UR  - https://proceedings.mlr.press/v51/wang16c.html
AB  - A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly-nonlinear functions for complex data sets. However, the network structure of DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGP, where the data is processed sequentially. We also propose two DGP extensions to handle heteroscedasticity and multi-task learning. Our experimental evaluation shows the effectiveness of our sequential inference framework on a number of important learning tasks.
ER  -
APA
Wang, Y., Brubaker, M., Chaib-Draa, B. & Urtasun, R. (2016). Sequential Inference for Deep Gaussian Process. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:694-703. Available from https://proceedings.mlr.press/v51/wang16c.html.