Rotation Invariant Householder Parameterization for Bayesian PCA

Rajbir Nirwan, Nils Bertschinger
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4820-4828, 2019.

Abstract

We consider probabilistic PCA and related factor models from a Bayesian perspective. These models are in general not identifiable, as the likelihood has a rotational symmetry. This gives rise to complicated posterior distributions with continuous subspaces of equal density, and thus hinders both the efficiency of inference and the interpretation of the obtained parameters. In particular, posterior averages over factor loadings become meaningless and only model predictions are unambiguous. Here, we propose a parameterization based on Householder transformations, which removes the rotational symmetry of the posterior. Furthermore, by relying on results from random matrix theory, we establish the parameter distribution that leaves the model unchanged relative to the original, rotationally symmetric formulation. In particular, we avoid the need to compute the Jacobian determinant of the parameter transformation. This allows us to efficiently implement probabilistic PCA in a rotation-invariant fashion in any state-of-the-art toolbox. We implement our model in the probabilistic programming language Stan and illustrate it on several examples.
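The two ideas in the abstract can be illustrated numerically. The sketch below is not the authors' Stan implementation; it is a minimal NumPy illustration, with all function names and sampling choices ours, of (i) why the PPCA marginal likelihood is invariant under rotations of the loadings and (ii) how a product of Householder reflections parameterizes an orthonormal basis. The paper's actual contribution, deriving the distribution over the reflection vectors that leaves the model equivalent to the original formulation, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
D, Q = 5, 2          # data dimension and number of latent factors

# (i) Rotational symmetry: the PPCA marginal covariance
#     C = W W^T + sigma^2 I is unchanged under W -> W R for any
#     orthogonal Q x Q matrix R, so W itself is not identifiable.
W = rng.normal(size=(D, Q))
sigma2 = 0.1
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a 2x2 rotation
C  = W @ W.T + sigma2 * np.eye(D)
Cr = (W @ R) @ (W @ R).T + sigma2 * np.eye(D)
assert np.allclose(C, Cr)

# (ii) Householder parameterization of an orthonormal D x Q basis:
#      apply Q reflections H_i = I - 2 v_i v_i^T (each acting on the
#      trailing D - i coordinates) to the identity and keep the
#      first Q columns.
def householder_basis(vs):
    """vs[i] is a unit vector of length D - i defining reflection i."""
    D = vs[0].shape[0]
    U = np.eye(D)
    for i, v in enumerate(vs):
        H = np.eye(D)
        H[i:, i:] -= 2.0 * np.outer(v, v)
        U = U @ H
    return U[:, :len(vs)]

# For illustration we draw arbitrary unit vectors; the paper derives
# the specific distribution over these vectors that reproduces the
# rotationally symmetric model without a Jacobian correction.
vs = []
for i in range(Q):
    v = rng.normal(size=D - i)
    vs.append(v / np.linalg.norm(v))
U = householder_basis(vs)
assert np.allclose(U.T @ U, np.eye(Q))   # columns are orthonormal
```

Because each reflection H = I - 2 v v^T is orthogonal, their product is orthogonal by construction, so the basis is well defined for any choice of unit vectors and the rotational ambiguity is fixed by the parameterization itself.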

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-nirwan19a,
  title     = {Rotation Invariant Householder Parameterization for {B}ayesian {PCA}},
  author    = {Nirwan, Rajbir and Bertschinger, Nils},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4820--4828},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/nirwan19a/nirwan19a.pdf},
  url       = {https://proceedings.mlr.press/v97/nirwan19a.html}
}
EndNote
%0 Conference Paper
%T Rotation Invariant Householder Parameterization for Bayesian PCA
%A Rajbir Nirwan
%A Nils Bertschinger
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-nirwan19a
%I PMLR
%P 4820--4828
%U https://proceedings.mlr.press/v97/nirwan19a.html
%V 97
APA
Nirwan, R. & Bertschinger, N. (2019). Rotation Invariant Householder Parameterization for Bayesian PCA. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4820-4828. Available from https://proceedings.mlr.press/v97/nirwan19a.html.
