DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures

Andrew Lawrence, Carl Henrik Ek, Neill Campbell
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3682-3691, 2019.

Abstract

We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.
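The structural prior mentioned in the abstract can be illustrated with the standard truncated stick-breaking construction of a Dirichlet process. This is a generic, illustrative sketch only, not the authors' implementation; the concentration `alpha` and the truncation level are hypothetical choices.

```python
import numpy as np

def stick_breaking_weights(alpha, num_components, rng):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Draws beta_k ~ Beta(1, alpha) and sets w_k = beta_k * prod_{j<k}(1 - beta_j).
    In a structure-learning setting, each observed dimension could be assigned
    to component k with probability w_k; larger `alpha` spreads mass over more
    components. (Illustrative only; not the paper's inference scheme.)
    """
    betas = rng.beta(1.0, alpha, size=num_components)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking_weights(alpha=2.0, num_components=20, rng=rng)
print(w.sum())  # approaches 1 as the truncation level grows
```

A finite truncation like this is also what makes the prior tractable inside a variational bound, since the number of components is fixed during optimisation.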

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-lawrence19a,
  title     = {{DP}-{GP}-{LVM}: A {B}ayesian Non-Parametric Model for Learning Multivariate Dependency Structures},
  author    = {Lawrence, Andrew and Ek, Carl Henrik and Campbell, Neill},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {3682--3691},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/lawrence19a/lawrence19a.pdf},
  url       = {https://proceedings.mlr.press/v97/lawrence19a.html},
  abstract  = {We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.}
}
Endnote
%0 Conference Paper
%T DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures
%A Andrew Lawrence
%A Carl Henrik Ek
%A Neill Campbell
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-lawrence19a
%I PMLR
%P 3682--3691
%U https://proceedings.mlr.press/v97/lawrence19a.html
%V 97
%X We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.
APA
Lawrence, A., Ek, C.H. & Campbell, N. (2019). DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:3682-3691. Available from https://proceedings.mlr.press/v97/lawrence19a.html.