GP-ALPS: Automatic Latent Process Selection for Multi-Output Gaussian Process Models

Pavel Berkovich, Eric Perim, Wessel Bruinsma
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-14, 2020.

Abstract

In this work, we apply Bayesian model selection to the calibration of the complexity of the latent space. We propose an extension of the linear mixing model (LMM) that automatically chooses the latent processes by turning off those that do not meaningfully contribute to explaining the data. We call the technique Gaussian Process Automatic Latent Process Selection (GP-ALPS). The extra functionality of GP-ALPS comes at the cost of exact inference, so we devise a variational inference (VI) scheme and demonstrate its suitability in a set of preliminary experiments. We also assess the quality of the variational posterior by comparing our approximate results with those obtained via a Markov Chain Monte Carlo (MCMC) approach.
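
To make the selection mechanism concrete, the sketch below shows one plausible reading of the generative model described in the abstract: a handful of latent GPs are mixed linearly into the observed outputs, and each latent process is gated by a Bernoulli switch that can prune it from the model entirely. The function names (sample_gp_alps_prior, rbf_kernel), the kernel and switch-prior choices, and the NumPy implementation are illustrative assumptions rather than the authors' code; the paper's actual inference uses a variational scheme, which is not shown here.

import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_alps_prior(x, num_latent=4, num_outputs=3, switch_prob=0.5,
                         noise_var=0.01, seed=0):
    # Draw one sample from a GP-ALPS-style generative model: latent GPs are
    # mixed linearly into outputs, but each latent process is gated by a
    # Bernoulli switch that can turn it off.
    rng = np.random.default_rng(seed)
    n = len(x)

    # Latent processes: independent GP draws with different lengthscales.
    K = [rbf_kernel(x, x, lengthscale=0.5 * (m + 1)) for m in range(num_latent)]
    latents = np.stack([
        rng.multivariate_normal(np.zeros(n), K[m] + 1e-8 * np.eye(n))
        for m in range(num_latent)
    ])                                                        # (num_latent, n)

    # Bernoulli switches: a latent process with switch 0 is pruned.
    switches = rng.binomial(1, switch_prob, size=num_latent)  # (num_latent,)

    # Mixing matrix maps the gated latent processes to the observed outputs.
    H = rng.normal(size=(num_outputs, num_latent))            # (num_outputs, num_latent)
    f = H @ (switches[:, None] * latents)                     # (num_outputs, n)

    # Add independent observation noise to each output.
    y = f + np.sqrt(noise_var) * rng.normal(size=f.shape)
    return y, switches

x = np.linspace(0.0, 10.0, 100)
y, switches = sample_gp_alps_prior(x)
print("active latent processes:", switches)

In GP-ALPS the switch variables are inferred from data rather than sampled, so latent processes whose posterior switch probability is negligible are effectively removed from the model; the sketch above only illustrates the prior structure that makes this possible.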

Cite this Paper


BibTeX
@InProceedings{pmlr-v118-berkovich20a,
  title     = {GP-ALPS: Automatic Latent Process Selection for Multi-Output Gaussian Process Models},
  author    = {Berkovich, Pavel and Perim, Eric and Bruinsma, Wessel},
  booktitle = {Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference},
  pages     = {1--14},
  year      = {2020},
  editor    = {Zhang, Cheng and Ruiz, Francisco and Bui, Thang and Dieng, Adji Bousso and Liang, Dawen},
  volume    = {118},
  series    = {Proceedings of Machine Learning Research},
  month     = {08 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v118/berkovich20a/berkovich20a.pdf},
  url       = {https://proceedings.mlr.press/v118/berkovich20a.html},
  abstract  = {In this work, we apply Bayesian model selection to the calibration of the complexity of the latent space. We propose an extension of the LMM that automatically chooses the latent processes by turning off those that do not meaningfully contribute to explaining the data. We call the technique Gaussian Process Automatic Latent Process Selection (GP-ALPS). The extra functionality of GP-ALPS comes at the cost of exact inference, so we devise a variational inference (VI) scheme and demonstrate its suitability in a set of preliminary experiments. We also assess the quality of the variational posterior by comparing our approximate results with those obtained via a Markov Chain Monte Carlo (MCMC) approach.}
}
Endnote
%0 Conference Paper
%T GP-ALPS: Automatic Latent Process Selection for Multi-Output Gaussian Process Models
%A Pavel Berkovich
%A Eric Perim
%A Wessel Bruinsma
%B Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2020
%E Cheng Zhang
%E Francisco Ruiz
%E Thang Bui
%E Adji Bousso Dieng
%E Dawen Liang
%F pmlr-v118-berkovich20a
%I PMLR
%P 1--14
%U https://proceedings.mlr.press/v118/berkovich20a.html
%V 118
%X In this work, we apply Bayesian model selection to the calibration of the complexity of the latent space. We propose an extension of the LMM that automatically chooses the latent processes by turning off those that do not meaningfully contribute to explaining the data. We call the technique Gaussian Process Automatic Latent Process Selection (GP-ALPS). The extra functionality of GP-ALPS comes at the cost of exact inference, so we devise a variational inference (VI) scheme and demonstrate its suitability in a set of preliminary experiments. We also assess the quality of the variational posterior by comparing our approximate results with those obtained via a Markov Chain Monte Carlo (MCMC) approach.
APA
Berkovich, P., Perim, E. & Bruinsma, W. (2020). GP-ALPS: Automatic Latent Process Selection for Multi-Output Gaussian Process Models. Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 118:1-14. Available from https://proceedings.mlr.press/v118/berkovich20a.html.