Learning for Larger Datasets with the Gaussian Process Latent Variable Model

Neil D. Lawrence
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:243-250, 2007.

Abstract

In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GPLVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-lawrence07a,
  title     = {Learning for Larger Datasets with the Gaussian Process Latent Variable Model},
  author    = {Lawrence, Neil D.},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages     = {243--250},
  year      = {2007},
  editor    = {Meila, Marina and Shen, Xiaotong},
  volume    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Juan, Puerto Rico},
  month     = {21--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v2/lawrence07a/lawrence07a.pdf},
  url       = {https://proceedings.mlr.press/v2/lawrence07a.html},
  abstract  = {In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GPLVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.}
}
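The BibTeX entry above can be used directly from a LaTeX document. A minimal sketch, assuming the entry is saved in a bibliography file named `refs.bib` (a placeholder filename):

```latex
% refs.bib is assumed to contain the @InProceedings entry above.
\documentclass{article}
\begin{document}
Sparse approximations for the GP-LVM are compared
by \cite{pmlr-v2-lawrence07a}.
\bibliographystyle{plain}
\bibliography{refs}
\end{document}
```

Running `latex`, then `bibtex`, then `latex` twice resolves the citation key into a numbered reference.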
Endnote
%0 Conference Paper
%T Learning for Larger Datasets with the Gaussian Process Latent Variable Model
%A Neil D. Lawrence
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-lawrence07a
%I PMLR
%P 243--250
%U https://proceedings.mlr.press/v2/lawrence07a.html
%V 2
%X In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GPLVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
RIS
TY  - CPAPER
TI  - Learning for Larger Datasets with the Gaussian Process Latent Variable Model
AU  - Neil D. Lawrence
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-lawrence07a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 2
SP  - 243
EP  - 250
L1  - http://proceedings.mlr.press/v2/lawrence07a/lawrence07a.pdf
UR  - https://proceedings.mlr.press/v2/lawrence07a.html
AB  - In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GPLVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
ER  -
APA
Lawrence, N.D. (2007). Learning for Larger Datasets with the Gaussian Process Latent Variable Model. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 2:243-250. Available from https://proceedings.mlr.press/v2/lawrence07a.html.