Deep Gaussian Processes with Importance-Weighted Variational Inference

Hugh Salimbeni, Vincent Dutordoir, James Hensman, Marc Deisenroth
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5589-5598, 2019.

Abstract

Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings. Non-Gaussian marginals are essential for modelling real-world data and can be generated from the DGP by incorporating uncorrelated variables into the model. Previous work on the DGP model has introduced noise additively and used variational inference with a combination of sparse Gaussian processes and mean-field Gaussians for the approximate posterior. Additive noise attenuates the signal, and the Gaussian form of the variational distribution may lead to an inaccurate posterior. We instead incorporate noisy variables as latent covariates and propose a novel importance-weighted objective, which leverages analytic results and provides a mechanism to trade off computation for improved accuracy. Our results demonstrate that the importance-weighted objective works well in practice and consistently outperforms classical variational inference, especially for deeper models.
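The importance-weighted objective referred to above averages K importance weights inside the logarithm, in the style of importance-weighted variational bounds: K = 1 recovers the standard ELBO, and increasing K tightens the bound on the marginal likelihood at proportional computational cost. The following is a minimal NumPy sketch of this generic K-sample bound, not the authors' DGP implementation (which additionally exploits analytic results for the GP layers); the function names and the toy model are illustrative assumptions.

import numpy as np
from scipy.special import logsumexp

def importance_weighted_bound(log_joint, log_q, sample_q, num_samples=5):
    # One Monte Carlo estimate of the K-sample bound
    #   L_K = E_{w_1..w_K ~ q} [ log (1/K) sum_k p(y, w_k) / q(w_k) ],
    # which lower-bounds log p(y); K = 1 gives the standard ELBO.
    w = sample_q(num_samples)                # K draws from the proposal q(w)
    log_weights = log_joint(w) - log_q(w)    # log p(y, w_k) - log q(w_k)
    return logsumexp(log_weights) - np.log(num_samples)

# Toy check with a conjugate model: p(w) = N(0, 1), p(y | w) = N(w, 1).
# When q(w) is the exact posterior N(y/2, 1/2), every importance weight
# equals p(y), so the bound is exact for any K: log p(y) = log N(y; 0, 2).
y = 1.3
log_joint = lambda w: -np.log(2 * np.pi) - 0.5 * w**2 - 0.5 * (y - w)**2
log_q = lambda w: -0.5 * np.log(np.pi) - (w - y / 2)**2
sample_q = lambda k: y / 2 + np.sqrt(0.5) * np.random.randn(k)
print(importance_weighted_bound(log_joint, log_q, sample_q, num_samples=20))
# prints -0.5*log(4*pi) - y**2/4 (about -1.688), independent of the draws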

Cite this Paper

BibTeX
@InProceedings{pmlr-v97-salimbeni19a,
  title     = {Deep {G}aussian Processes with Importance-Weighted Variational Inference},
  author    = {Salimbeni, Hugh and Dutordoir, Vincent and Hensman, James and Deisenroth, Marc},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5589--5598},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/salimbeni19a/salimbeni19a.pdf},
  url       = {https://proceedings.mlr.press/v97/salimbeni19a.html},
  abstract  = {Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings. Non-Gaussian marginals are essential for modelling real-world data and can be generated from the DGP by incorporating uncorrelated variables into the model. Previous work on the DGP model has introduced noise additively and used variational inference with a combination of sparse Gaussian processes and mean-field Gaussians for the approximate posterior. Additive noise attenuates the signal, and the Gaussian form of the variational distribution may lead to an inaccurate posterior. We instead incorporate noisy variables as latent covariates and propose a novel importance-weighted objective, which leverages analytic results and provides a mechanism to trade off computation for improved accuracy. Our results demonstrate that the importance-weighted objective works well in practice and consistently outperforms classical variational inference, especially for deeper models.}
}
Endnote
%0 Conference Paper
%T Deep Gaussian Processes with Importance-Weighted Variational Inference
%A Hugh Salimbeni
%A Vincent Dutordoir
%A James Hensman
%A Marc Deisenroth
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-salimbeni19a
%I PMLR
%P 5589--5598
%U https://proceedings.mlr.press/v97/salimbeni19a.html
%V 97
%X Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings. Non-Gaussian marginals are essential for modelling real-world data and can be generated from the DGP by incorporating uncorrelated variables into the model. Previous work on the DGP model has introduced noise additively and used variational inference with a combination of sparse Gaussian processes and mean-field Gaussians for the approximate posterior. Additive noise attenuates the signal, and the Gaussian form of the variational distribution may lead to an inaccurate posterior. We instead incorporate noisy variables as latent covariates and propose a novel importance-weighted objective, which leverages analytic results and provides a mechanism to trade off computation for improved accuracy. Our results demonstrate that the importance-weighted objective works well in practice and consistently outperforms classical variational inference, especially for deeper models.
APA
Salimbeni, H., Dutordoir, V., Hensman, J. & Deisenroth, M. (2019). Deep Gaussian Processes with Importance-Weighted Variational Inference. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5589-5598. Available from https://proceedings.mlr.press/v97/salimbeni19a.html.
