Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables

Tomi Peltola, Pasi Jylänki, Aki Vehtari
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:769-777, 2014.

Abstract

We describe how a deterministic Gaussian posterior approximation can be constructed using expectation propagation (EP) for models where the likelihood function depends on an inner product of two multivariate random variables. The family of applicable models includes a wide variety of important linear latent variable models used in statistical machine learning, such as principal component and factor analysis, their linear extensions, and errors-in-variables regression. The EP computations are facilitated by an integral transformation of the Dirac delta function, which reduces the multidimensional integrals over the two multivariate random variables to an analytically tractable form, up to one-dimensional integrals that are analytically intractable but can be computed efficiently by numerical methods. We study the resulting posterior approximations in sparse principal component analysis with Gaussian and probit likelihoods. Comparisons to Gibbs sampling and variational inference are presented.
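To illustrate the structure the abstract describes (this is a sketch, not the paper's algorithm or code): for independent Gaussian vectors w and x, the characteristic function of z = wᵀx can be obtained in closed form by integrating x out first and then evaluating the remaining Gaussian integral over w, so the density of z requires only a single one-dimensional numerical integral. The function names below are illustrative, not from the paper.

```python
# Sketch (assumed setup, not the paper's implementation): density of
# z = w^T x for independent w ~ N(mw, Sw), x ~ N(mx, Sx), via 1-D
# numerical inversion of a closed-form characteristic function.
import numpy as np

def inner_product_cf(tau, mw, Sw, mx, Sx):
    """E[exp(i*tau * w^T x)] for independent Gaussian w and x."""
    d = len(mw)
    # Integrating x out first gives exp(i*tau*w^T mx - 0.5*tau^2 * w^T Sx w);
    # the remaining integral over w is Gaussian and closes analytically.
    P = np.linalg.inv(Sw) + tau**2 * Sx           # precision of the w-integral
    h = np.linalg.solve(Sw, mw) + 1j * tau * mx   # (complex) linear coefficient
    m = np.linalg.solve(P, h)
    _, logdet = np.linalg.slogdet(np.eye(d) + tau**2 * Sw @ Sx)
    return np.exp(-0.5 * logdet + 0.5 * m @ h
                  - 0.5 * mw @ np.linalg.solve(Sw, mw))

def inner_product_pdf(z, mw, Sw, mx, Sx, T=400.0, n=40001):
    """Density of z = w^T x: one-dimensional Fourier inversion integral,
    computed numerically on a uniform grid over [-T, T]."""
    taus = np.linspace(-T, T, n)
    dtau = taus[1] - taus[0]
    vals = np.array([inner_product_cf(t, mw, Sw, mx, Sx) for t in taus])
    # Riemann sum of (1/2pi) * integral of phi(tau) * exp(-i*tau*z) dtau;
    # endpoint corrections are negligible at this truncation.
    return float((vals * np.exp(-1j * taus * z)).sum().real * dtau / (2 * np.pi))
```

As a check on the sketch: for d = 1 with standard-normal w and x, the characteristic function reduces to (1 + τ²)^(−1/2), and the density of z = w·x is the known product-normal density K₀(|z|)/π, roughly 0.134 at z = 1.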

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-peltola14,
  title     = {{Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables}},
  author    = {Peltola, Tomi and Jylänki, Pasi and Vehtari, Aki},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {769--777},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/peltola14.pdf},
  url       = {https://proceedings.mlr.press/v33/peltola14.html},
  abstract  = {We describe how a deterministic Gaussian posterior approximation can be constructed using expectation propagation (EP) for models, where the likelihood function depends on an inner product of two multivariate random variables. The family of applicable models includes a wide variety of important linear latent variable models used in statistical machine learning, such as principal component and factor analysis, their linear extensions, and errors-in-variables regression. The EP computations are facilitated by an integral transformation of the Dirac delta function, which allows transforming the multidimensional integrals over the two multivariate random variables into an analytically tractable form up to one-dimensional analytically intractable integrals that can be efficiently computed numerically. We study the resulting posterior approximations in sparse principal component analysis with Gaussian and probit likelihoods. Comparisons to Gibbs sampling and variational inference are presented.}
}
Endnote
%0 Conference Paper
%T Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables
%A Tomi Peltola
%A Pasi Jylänki
%A Aki Vehtari
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-peltola14
%I PMLR
%P 769--777
%U https://proceedings.mlr.press/v33/peltola14.html
%V 33
%X We describe how a deterministic Gaussian posterior approximation can be constructed using expectation propagation (EP) for models, where the likelihood function depends on an inner product of two multivariate random variables. The family of applicable models includes a wide variety of important linear latent variable models used in statistical machine learning, such as principal component and factor analysis, their linear extensions, and errors-in-variables regression. The EP computations are facilitated by an integral transformation of the Dirac delta function, which allows transforming the multidimensional integrals over the two multivariate random variables into an analytically tractable form up to one-dimensional analytically intractable integrals that can be efficiently computed numerically. We study the resulting posterior approximations in sparse principal component analysis with Gaussian and probit likelihoods. Comparisons to Gibbs sampling and variational inference are presented.
RIS
TY  - CPAPER
TI  - Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables
AU  - Tomi Peltola
AU  - Pasi Jylänki
AU  - Aki Vehtari
BT  - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
DA  - 2014/04/02
ED  - Samuel Kaski
ED  - Jukka Corander
ID  - pmlr-v33-peltola14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 33
SP  - 769
EP  - 777
L1  - http://proceedings.mlr.press/v33/peltola14.pdf
UR  - https://proceedings.mlr.press/v33/peltola14.html
AB  - We describe how a deterministic Gaussian posterior approximation can be constructed using expectation propagation (EP) for models, where the likelihood function depends on an inner product of two multivariate random variables. The family of applicable models includes a wide variety of important linear latent variable models used in statistical machine learning, such as principal component and factor analysis, their linear extensions, and errors-in-variables regression. The EP computations are facilitated by an integral transformation of the Dirac delta function, which allows transforming the multidimensional integrals over the two multivariate random variables into an analytically tractable form up to one-dimensional analytically intractable integrals that can be efficiently computed numerically. We study the resulting posterior approximations in sparse principal component analysis with Gaussian and probit likelihoods. Comparisons to Gibbs sampling and variational inference are presented.
ER  -
APA
Peltola, T., Jylänki, P. & Vehtari, A. (2014). Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:769-777. Available from https://proceedings.mlr.press/v33/peltola14.html.