Conditional Expectation Propagation

Zheng Wang, Shandian Zhe
Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, PMLR 115:28-37, 2020.

Abstract

Expectation propagation (EP) is a powerful approximate inference algorithm. However, a critical barrier to applying EP is that the moment matching in message updates can be intractable. Handcrafting approximations is usually tricky and lacks generalizability, and importance sampling is very expensive. While Laplace propagation offers an excellent solution, it has to run a numerical optimization to find a Laplace approximation in every update, which is still quite inefficient. To overcome these practical barriers, we propose conditional expectation propagation (CEP), which performs conditional moment matching with the variables outside each message held fixed, and then takes the expectation w.r.t. their approximate posterior. The conditional moments are often analytical and much easier to derive. In the most general case, we can use (fully) factorized messages so that the conditional moments can be represented by quadrature formulas. We then compute the expectation of the conditional moments via Taylor approximations when necessary. In this way, our algorithm can always conduct efficient, analytical fixed-point iterations. Experiments on several popular models, for which standard EP is either available or unavailable, demonstrate the advantages of CEP in both inference quality and computational efficiency.
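To make the contrast concrete, here is a minimal sketch of the two updates; the notation (tilted distribution $\hat p_n$, cavity $q^{\setminus n}$, sufficient statistics $\phi$, and conditional-moment function $m_i$) is illustrative rather than the paper's exact symbols. Standard EP refines an approximate factor $\tilde f_n$ by matching moments under the tilted distribution,

\[ \hat p_n(\boldsymbol{\theta}) \propto q^{\setminus n}(\boldsymbol{\theta})\, f_n(\boldsymbol{\theta}), \qquad q^{\setminus n}(\boldsymbol{\theta}) \propto q(\boldsymbol{\theta}) / \tilde f_n(\boldsymbol{\theta}), \qquad \mathbb{E}_{q^{\mathrm{new}}}[\phi(\boldsymbol{\theta})] = \mathbb{E}_{\hat p_n}[\phi(\boldsymbol{\theta})], \]

and the right-hand expectation can be intractable. CEP instead computes the moments of one variable $\theta_i$ conditioned on the remaining variables $\boldsymbol{\theta}_{\setminus i}$,

\[ m_i(\boldsymbol{\theta}_{\setminus i}) = \mathbb{E}_{\hat p_n}\big[\phi(\theta_i) \,\big|\, \boldsymbol{\theta}_{\setminus i}\big], \]

which is often available in closed form, and then averages over the current approximate posterior; when that average has no closed form, a first-order Taylor expansion (delta method) around the posterior mean gives

\[ \mathbb{E}_{q(\boldsymbol{\theta}_{\setminus i})}\big[ m_i(\boldsymbol{\theta}_{\setminus i}) \big] \approx m_i\big(\mathbb{E}_{q}[\boldsymbol{\theta}_{\setminus i}]\big). \]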

Cite this Paper


BibTeX
@InProceedings{pmlr-v115-wang20a,
  title     = {Conditional Expectation Propagation},
  author    = {Wang, Zheng and Zhe, Shandian},
  booktitle = {Proceedings of The 35th Uncertainty in Artificial Intelligence Conference},
  pages     = {28--37},
  year      = {2020},
  editor    = {Adams, Ryan P. and Gogate, Vibhav},
  volume    = {115},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--25 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v115/wang20a/wang20a.pdf},
  url       = {https://proceedings.mlr.press/v115/wang20a.html},
  abstract  = {Expectation propagation (EP) is a powerful approximate inference algorithm. However, a critical barrier to applying EP is that the moment matching in message updates can be intractable. Handcrafting approximations is usually tricky and lacks generalizability, and importance sampling is very expensive. While Laplace propagation offers an excellent solution, it has to run a numerical optimization to find a Laplace approximation in every update, which is still quite inefficient. To overcome these practical barriers, we propose conditional expectation propagation (CEP), which performs conditional moment matching with the variables outside each message held fixed, and then takes the expectation w.r.t. their approximate posterior. The conditional moments are often analytical and much easier to derive. In the most general case, we can use (fully) factorized messages so that the conditional moments can be represented by quadrature formulas. We then compute the expectation of the conditional moments via Taylor approximations when necessary. In this way, our algorithm can always conduct efficient, analytical fixed-point iterations. Experiments on several popular models, for which standard EP is either available or unavailable, demonstrate the advantages of CEP in both inference quality and computational efficiency.}
}
EndNote
%0 Conference Paper
%T Conditional Expectation Propagation
%A Zheng Wang
%A Shandian Zhe
%B Proceedings of The 35th Uncertainty in Artificial Intelligence Conference
%C Proceedings of Machine Learning Research
%D 2020
%E Ryan P. Adams
%E Vibhav Gogate
%F pmlr-v115-wang20a
%I PMLR
%P 28--37
%U https://proceedings.mlr.press/v115/wang20a.html
%V 115
%X Expectation propagation (EP) is a powerful approximate inference algorithm. However, a critical barrier to applying EP is that the moment matching in message updates can be intractable. Handcrafting approximations is usually tricky and lacks generalizability, and importance sampling is very expensive. While Laplace propagation offers an excellent solution, it has to run a numerical optimization to find a Laplace approximation in every update, which is still quite inefficient. To overcome these practical barriers, we propose conditional expectation propagation (CEP), which performs conditional moment matching with the variables outside each message held fixed, and then takes the expectation w.r.t. their approximate posterior. The conditional moments are often analytical and much easier to derive. In the most general case, we can use (fully) factorized messages so that the conditional moments can be represented by quadrature formulas. We then compute the expectation of the conditional moments via Taylor approximations when necessary. In this way, our algorithm can always conduct efficient, analytical fixed-point iterations. Experiments on several popular models, for which standard EP is either available or unavailable, demonstrate the advantages of CEP in both inference quality and computational efficiency.
APA
Wang, Z. & Zhe, S. (2020). Conditional Expectation Propagation. Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, in Proceedings of Machine Learning Research 115:28-37. Available from https://proceedings.mlr.press/v115/wang20a.html.