Bayesian Gaussian Process Latent Variable Model

Michalis Titsias, Neil D. Lawrence
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings 9:844-851, 2010.

Abstract

We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
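The abstract's central claims, variational integration over the latent inputs and automatic selection of the latent dimensionality, can be illustrated in a few lines of code. Below is a minimal sketch using the GPy library's BayesianGPLVM (a later open-source implementation of this model, not code from the paper); the synthetic data, latent dimensionality Q, kernel choice, and inducing-point count are all illustrative assumptions.

```python
# Minimal Bayesian GP-LVM sketch using GPy (an open-source implementation
# of this model, not the authors' code). Data and settings are illustrative.
import numpy as np
import GPy

np.random.seed(0)
Y = np.random.randn(100, 12)        # placeholder data: 100 points, 12 observed dimensions

Q = 5                               # generous upper bound on the latent dimensionality
kernel = GPy.kern.RBF(Q, ARD=True)  # one ARD lengthscale per latent dimension

# The model variationally integrates out the latent inputs X; optimize()
# maximizes the resulting lower bound on the marginal likelihood.
m = GPy.models.BayesianGPLVM(Y, input_dim=Q, kernel=kernel, num_inducing=20)
m.optimize(messages=False, max_iters=1000)

# Dimensions the bound has switched off get large lengthscales (small
# inverse lengthscales), so the effective dimensionality can be read off here.
print(1.0 / m.kern.lengthscale)
```

Latent dimensions with near-zero inverse lengthscales contribute nothing to the fit, which is how the automatic dimensionality selection described in the abstract shows up in practice.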

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-titsias10a,
  title     = {Bayesian Gaussian Process Latent Variable Model},
  author    = {Michalis Titsias and Neil D. Lawrence},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {844--851},
  year      = {2010},
  editor    = {Yee Whye Teh and Mike Titterington},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {JMLR Workshop and Conference Proceedings},
  pdf       = {http://proceedings.mlr.press/v9/titsias10a/titsias10a.pdf},
  url       = {http://proceedings.mlr.press/v9/titsias10a.html},
  abstract  = {We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.}
}
Endnote
%0 Conference Paper
%T Bayesian Gaussian Process Latent Variable Model
%A Michalis Titsias
%A Neil D. Lawrence
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-titsias10a
%I PMLR
%J Proceedings of Machine Learning Research
%P 844--851
%U http://proceedings.mlr.press
%V 9
%W PMLR
%X We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
RIS
TY - CPAPER
TI - Bayesian Gaussian Process Latent Variable Model
AU - Michalis Titsias
AU - Neil D. Lawrence
BT - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
PY - 2010/03/31
DA - 2010/03/31
ED - Yee Whye Teh
ED - Mike Titterington
ID - pmlr-v9-titsias10a
PB - PMLR
SP - 844
DP - PMLR
EP - 851
L1 - http://proceedings.mlr.press/v9/titsias10a/titsias10a.pdf
UR - http://proceedings.mlr.press/v9/titsias10a.html
AB - We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
ER -
APA
Titsias, M. & Lawrence, N. D. (2010). Bayesian Gaussian Process Latent Variable Model. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in PMLR 9:844-851.