Improving posterior marginal approximations in latent Gaussian models

Botond Cseke, Tom Heskes
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings 9:121-128, 2010.

Abstract

We consider the problem of correcting the posterior marginal approximations computed by expectation propagation and Laplace approximation in latent Gaussian models and propose correction methods that are similar in spirit to the Laplace approximation of Tierney and Kadane (1986). We show that in the case of sparse Gaussian models, the computational complexity of expectation propagation can be made comparable to that of the Laplace approximation by using a parallel updating scheme. In some cases, expectation propagation gives excellent estimates where the Laplace approximation fails. Inspired by bounds on the marginal corrections, we arrive at factorized approximations, which can be applied on top of both expectation propagation and Laplace. These give results nearly indistinguishable from the non-factorized approximations in a fraction of the time.
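For context, the Tierney and Kadane (1986) construction that the corrections are "similar in spirit" to can be sketched as follows; the notation (x_i, x_{\setminus i}, \hat{x}_{\setminus i}) is ours and the paper's actual corrections on top of expectation propagation and Laplace differ in detail. Writing the posterior marginal of component x_i as an integral over the remaining latent variables,

p(x_i \mid y) \;\propto\; \int p(y \mid x)\, p(x)\, dx_{\setminus i},

and applying a Laplace approximation to this integral separately for each fixed value of x_i gives

p(x_i \mid y) \;\approx\; c \; p\big(y, x_i, \hat{x}_{\setminus i}(x_i)\big)\, \Big| -\nabla^2_{x_{\setminus i}} \log p(y, x_i, x_{\setminus i}) \big|_{x_{\setminus i} = \hat{x}_{\setminus i}(x_i)} \Big|^{-1/2},

where \hat{x}_{\setminus i}(x_i) = \arg\max_{x_{\setminus i}} \log p(y, x_i, x_{\setminus i}) is the conditional mode and c is a normalization constant. Repeating the inner optimization and determinant evaluation for every value of x_i on a grid is what makes such non-factorized corrections expensive, and what the factorized approximations proposed in the paper are designed to avoid.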

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-cseke10a,
  title     = {Improving posterior marginal approximations in latent Gaussian models},
  author    = {Botond Cseke and Tom Heskes},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {121--128},
  year      = {2010},
  editor    = {Yee Whye Teh and Mike Titterington},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {JMLR Workshop and Conference Proceedings},
  pdf       = {http://proceedings.mlr.press/v9/cseke10a/cseke10a.pdf},
  url       = {http://proceedings.mlr.press/v9/cseke10a.html}
}
APA
Cseke, B. & Heskes, T. (2010). Improving posterior marginal approximations in latent Gaussian models. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in PMLR 9:121-128.