Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables

Tomi Peltola, Pasi Jylänki, Aki Vehtari;
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:769-777, 2014.

Abstract

We describe how a deterministic Gaussian posterior approximation can be constructed using expectation propagation (EP) for models where the likelihood function depends on an inner product of two multivariate random variables. The family of applicable models includes a wide variety of important linear latent variable models used in statistical machine learning, such as principal component and factor analysis, their linear extensions, and errors-in-variables regression. The EP computations are facilitated by an integral transformation of the Dirac delta function, which allows the multidimensional integrals over the two multivariate random variables to be reduced to an analytically tractable form, up to one-dimensional analytically intractable integrals that can be computed efficiently by numerical quadrature. We study the resulting posterior approximations in sparse principal component analysis with Gaussian and probit likelihoods, and present comparisons to Gibbs sampling and variational inference.
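The key computational pattern the abstract describes is that EP moment updates reduce to one-dimensional integrals that are evaluated numerically. As a hedged illustration (not the paper's actual algorithm, which involves the delta-function transformation over two multivariate variables), the sketch below assumes a scalar Gaussian cavity distribution N(z; m, v) over the inner product z and a probit likelihood Φ(z), computes the tilted moments by one-dimensional trapezoidal quadrature, and checks them against the standard closed-form probit-Gaussian moments known from EP for probit models. All variable names here are illustrative.

```python
import numpy as np
from math import erf, sqrt, pi

def std_normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def trapz(y, x):
    # Composite trapezoidal rule (kept explicit for portability).
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))

# Hypothetical Gaussian cavity over the scalar z = u^T v.
m, v = 0.3, 1.5

# Tilted distribution p(z) ∝ Φ(z) N(z; m, v): zeroth, first, and second
# moments by one-dimensional numerical quadrature.
z = np.linspace(m - 10.0 * sqrt(v), m + 10.0 * sqrt(v), 40001)
gauss = np.exp(-0.5 * (z - m) ** 2 / v) / sqrt(2.0 * pi * v)
lik = np.array([std_normal_cdf(t) for t in z])
w = lik * gauss

Z0 = trapz(w, z)                          # normalizer
mean = trapz(z * w, z) / Z0               # tilted mean
var = trapz(z ** 2 * w, z) / Z0 - mean ** 2  # tilted variance

# For a probit likelihood these moments have a known closed form
# (standard EP probit results), used here only as a correctness check:
s = m / sqrt(1.0 + v)
Z_exact = std_normal_cdf(s)
phi = np.exp(-0.5 * s ** 2) / sqrt(2.0 * pi)
mean_exact = m + v * phi / (Z_exact * sqrt(1.0 + v))
var_exact = v - v ** 2 * phi / ((1.0 + v) * Z_exact) * (s + phi / Z_exact)

print(Z0, mean, var)
```

For a general likelihood of the inner product no such closed form exists, which is where the numerically evaluated one-dimensional integrals of the paper come in; the quadrature above agrees with the analytic probit moments to high precision.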
