Compositional uncertainty in deep Gaussian processes

Ivan Ustyuzhaninov, Ieva Kazlauskaite, Markus Kaiser, Erik Bodin, Neill Campbell, Carl Henrik Ek
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:480-489, 2020.

Abstract

Gaussian processes (GPs) are nonparametric priors over functions. Fitting a GP implies computing a posterior distribution of functions consistent with the observed data. Similarly, deep Gaussian processes (DGPs) should allow us to compute a posterior distribution of compositions of multiple functions giving rise to the observations. However, exact Bayesian inference is intractable for DGPs, motivating the use of various approximations. We show that the application of simplifying mean-field assumptions across the hierarchy leads to the layers of a DGP collapsing to near-deterministic transformations. We argue that such an inference scheme is suboptimal, not taking advantage of the potential of the model to discover the compositional structure in the data. To address this issue, we examine alternative variational inference schemes allowing for dependencies across different layers and discuss their advantages and limitations.
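As a reading aid (this is not code from the paper), the sketch below draws a single sample from a two-layer DGP prior by composing two GP sample paths, y = f2(f1(x)): the compositional object whose posterior the paper is concerned with. Many different pairs (f1, f2) compose to nearly the same overall function, and a mean-field variational posterior, which factorises across layers as q(f1, f2) = q(f1) q(f2), cannot represent this ambiguity; as the paper argues, each layer then collapses towards a near-deterministic map. Kernel choice, lengthscales, and jitter here are illustrative assumptions.

import numpy as np

def rbf_kernel(x1, x2, lengthscale, variance=1.0):
    # Squared-exponential kernel matrix between two vectors of scalar inputs.
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_gp(x, lengthscale, rng, jitter=1e-6):
    # One zero-mean GP sample path evaluated at the inputs x;
    # jitter stabilises the Cholesky factorisation of the Gram matrix.
    K = rbf_kernel(x, x, lengthscale)
    chol = np.linalg.cholesky(K + jitter * np.eye(len(x)))
    return chol @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)

h = sample_gp(x, lengthscale=1.0, rng=rng)  # layer 1: h = f1(x)
y = sample_gp(h, lengthscale=0.5, rng=rng)  # layer 2: y = f2(h), evaluated at the warped inputs

Re-running the two draws with different seeds shows how varied the layer decompositions consistent with the DGP prior are; the variational schemes with cross-layer dependencies examined in the paper aim to retain this ambiguity in the posterior rather than collapsing it.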

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-ustyuzhaninov20a,
  title     = {Compositional uncertainty in deep Gaussian processes},
  author    = {Ustyuzhaninov, Ivan and Kazlauskaite, Ieva and Kaiser, Markus and Bodin, Erik and Campbell, Neill and Henrik Ek, Carl},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages     = {480--489},
  year      = {2020},
  editor    = {Peters, Jonas and Sontag, David},
  volume    = {124},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v124/ustyuzhaninov20a/ustyuzhaninov20a.pdf},
  url       = {https://proceedings.mlr.press/v124/ustyuzhaninov20a.html}
}
Endnote
%0 Conference Paper
%T Compositional uncertainty in deep Gaussian processes
%A Ivan Ustyuzhaninov
%A Ieva Kazlauskaite
%A Markus Kaiser
%A Erik Bodin
%A Neill Campbell
%A Carl Henrik Ek
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-ustyuzhaninov20a
%I PMLR
%P 480--489
%U https://proceedings.mlr.press/v124/ustyuzhaninov20a.html
%V 124
APA
Ustyuzhaninov, I., Kazlauskaite, I., Kaiser, M., Bodin, E., Campbell, N. & Henrik Ek, C. (2020). Compositional uncertainty in deep Gaussian processes. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:480-489. Available from https://proceedings.mlr.press/v124/ustyuzhaninov20a.html.
