Latent Variable Models for Dimensionality Reduction

Zhihua Zhang, Michael I. Jordan
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:655-662, 2009.

Abstract

Principal coordinate analysis (PCO), the dual of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO using a normal latent variable model, in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA, a nonlinear extension of PCA.
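The expectation-maximization machinery mentioned in the abstract is closely related to the well-known EM algorithm for probabilistic PCA (Tipping & Bishop, 1999). As a rough, hedged illustration of that style of latent variable estimation (not the authors' probabilistic PCO algorithm itself, whose updates are given in the paper), here is a minimal sketch of EM for probabilistic PCA; all names and parameters are illustrative choices, not from the paper:

```python
import numpy as np

def ppca_em(X, q, n_iter=200, seed=0):
    """EM for probabilistic PCA (Tipping & Bishop, 1999) -- an illustrative
    analogue of the latent variable estimation described in the abstract.

    Model: x = W z + mu + eps, z ~ N(0, I_q), eps ~ N(0, sigma2 * I_d).
    X: (n, d) data matrix; q: latent dimension.
    Returns the loading matrix W (d, q), noise variance sigma2,
    and posterior mean latent configurations Z (n, q).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    S = Xc.T @ Xc / n                       # sample covariance (d, d)
    W = rng.standard_normal((d, q))         # random initialization
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step quantities are folded into the closed-form M-step updates:
        M = W.T @ W + sigma2 * np.eye(q)    # (q, q) posterior precision factor
        Minv = np.linalg.inv(M)
        SW = S @ W
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new
    # Posterior means E[z|x] = M^{-1} W^T (x - mu) give the low-dimensional
    # configurations of the objects.
    M = W.T @ W + sigma2 * np.eye(q)
    Z = Xc @ W @ np.linalg.inv(M)
    return W, sigma2, Z
```

At convergence the columns of W span the principal subspace and sigma2 estimates the average discarded variance; the paper's probabilistic PCO instead works from inter-object (dis)similarities, the dual view of the same subspace problem.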

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-zhang09b,
  title     = {Latent Variable Models for Dimensionality Reduction},
  author    = {Zhihua Zhang and Michael I. Jordan},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {655--662},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/zhang09b/zhang09b.pdf},
  url       = {http://proceedings.mlr.press/v5/zhang09b.html},
  abstract  = {Principal coordinate analysis (PCO), the dual of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO using a normal latent variable model, in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA, a nonlinear extension of PCA.}
}
Endnote
%0 Conference Paper
%T Latent Variable Models for Dimensionality Reduction
%A Zhihua Zhang
%A Michael I. Jordan
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-zhang09b
%I PMLR
%J Proceedings of Machine Learning Research
%P 655--662
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X Principal coordinate analysis (PCO), the dual of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO using a normal latent variable model, in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA, a nonlinear extension of PCA.
RIS
TY - CPAPER
TI - Latent Variable Models for Dimensionality Reduction
AU - Zhihua Zhang
AU - Michael I. Jordan
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY - 2009/04/15
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-zhang09b
PB - PMLR
SP - 655
EP - 662
DP - PMLR
L1 - http://proceedings.mlr.press/v5/zhang09b/zhang09b.pdf
UR - http://proceedings.mlr.press/v5/zhang09b.html
AB - Principal coordinate analysis (PCO), the dual of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO using a normal latent variable model, in which maximum likelihood estimation and an expectation-maximization algorithm are respectively devised to calculate the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA, a nonlinear extension of PCA.
ER -
APA
Zhang, Z., & Jordan, M. I. (2009). Latent Variable Models for Dimensionality Reduction. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:655-662.
