Latent Variable Models for Dimensionality Reduction

Zhihua Zhang, Michael I. Jordan
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:655-662, 2009.

Abstract

Principal coordinate analysis (PCO), the dual of principal component analysis (PCA), is likewise a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO based on a normal latent variable model, for which maximum likelihood estimation and an expectation-maximization (EM) algorithm are devised to compute the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations of kernel PCA, a nonlinear extension of PCA.
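For background, the sketch below shows classical (non-probabilistic) PCO, which the paper's latent variable model reinterprets probabilistically: a matrix of pairwise distances is squared, double-centered into a Gram matrix, and eigendecomposed, with the top eigenvectors (scaled by the square roots of their eigenvalues) giving the low-dimensional configuration. This is only an illustrative sketch of the classical procedure, not the paper's MLE/EM formulation; the function name classical_pco and the synthetic test data are hypothetical.

import numpy as np

def classical_pco(D, q=2):
    """Classical principal coordinate analysis (classical MDS).

    D : (n, n) matrix of pairwise Euclidean distances.
    q : target dimension of the configuration.

    Returns an (n, q) matrix of coordinates whose pairwise
    distances approximate D.
    """
    n = D.shape[0]
    # Double-center the squared distances to recover an
    # inner-product (Gram) matrix B = -0.5 * J D^2 J.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    # Eigendecompose B and keep the q largest nonnegative eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:q]
    lam = np.clip(eigvals[idx], 0.0, None)
    # Coordinates are eigenvectors scaled by sqrt of the eigenvalues.
    return eigvecs[:, idx] * np.sqrt(lam)

# Usage: embed a small synthetic point cloud from its distance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Y = classical_pco(D, q=2)   # (20, 2) configuration

The paper's probabilistic PCO replaces this spectral recipe with a normal latent variable model fit by maximum likelihood or EM, which yields essentially the same configurations while admitting a likelihood-based treatment; an analogous construction is given for kernel PCA.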

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-zhang09b,
  title     = {Latent Variable Models for Dimensionality Reduction},
  author    = {Zhang, Zhihua and Jordan, Michael I.},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {655--662},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/zhang09b/zhang09b.pdf},
  url       = {https://proceedings.mlr.press/v5/zhang09b.html},
  abstract  = {Principal coordinate analysis (PCO), as a duality of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO by using a normal latent variable model in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA which is a nonlinear extension of PCA.}
}
Endnote
%0 Conference Paper
%T Latent Variable Models for Dimensionality Reduction
%A Zhihua Zhang
%A Michael I. Jordan
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-zhang09b
%I PMLR
%P 655--662
%U https://proceedings.mlr.press/v5/zhang09b.html
%V 5
%X Principal coordinate analysis (PCO), as a duality of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO by using a normal latent variable model in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA which is a nonlinear extension of PCA.
RIS
TY  - CPAPER
TI  - Latent Variable Models for Dimensionality Reduction
AU  - Zhihua Zhang
AU  - Michael I. Jordan
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-zhang09b
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 655
EP  - 662
L1  - http://proceedings.mlr.press/v5/zhang09b/zhang09b.pdf
UR  - https://proceedings.mlr.press/v5/zhang09b.html
AB  - Principal coordinate analysis (PCO), as a duality of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO by using a normal latent variable model in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA which is a nonlinear extension of PCA.
ER  -
APA
Zhang, Z. & Jordan, M. I. (2009). Latent Variable Models for Dimensionality Reduction. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:655-662. Available from https://proceedings.mlr.press/v5/zhang09b.html.