Approximate inference using conditional entropy decompositions

Amir Globerson, Tommi Jaakkola
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:131-138, 2007.

Abstract

We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.
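The abstract's core idea can be illustrated numerically: the entropy chain rule writes the joint entropy as a sum of conditional entropies along an elimination order, and replacing each full conditioning set with a subset can only increase each term, yielding an upper bound. The sketch below is an illustration of that inequality on a toy distribution, not the paper's optimization algorithm; all function names are our own.

```python
# Illustration (not the paper's algorithm): the entropy chain rule
# H(X1,X2,X3) = H(X1) + H(X2|X1) + H(X3|X1,X2) is exact, and
# conditioning each term on a *subset* of the preceding variables
# (here, dropping X1 from the last term) upper-bounds the joint entropy.
import itertools
import math
import random

random.seed(0)

# A random joint distribution over three binary variables.
states = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in states]
total = sum(weights)
p = {s: w / total for s, w in zip(states, weights)}

def marginal(dist, idxs):
    """Marginal distribution over the variables indexed by idxs."""
    out = {}
    for s, pr in dist.items():
        key = tuple(s[i] for i in idxs)
        out[key] = out.get(key, 0.0) + pr
    return out

def entropy(dist):
    return -sum(pr * math.log2(pr) for pr in dist.values() if pr > 0)

def cond_entropy(dist, target, given):
    """H(X_target | X_given) = H(X_target, X_given) - H(X_given)."""
    return entropy(marginal(dist, given + [target])) - entropy(marginal(dist, given))

H_joint = entropy(p)

# Exact chain rule along the order X1, X2, X3.
chain = cond_entropy(p, 0, []) + cond_entropy(p, 1, [0]) + cond_entropy(p, 2, [0, 1])

# Upper bound: condition X3 only on X2, a subset of its predecessors.
bound = cond_entropy(p, 0, []) + cond_entropy(p, 1, [0]) + cond_entropy(p, 2, [1])

assert abs(chain - H_joint) < 1e-9   # the chain rule is exact
assert bound >= H_joint - 1e-9       # subset conditioning gives an upper bound
```

The paper's method optimizes a bound of this form over subset marginals (with the elimination order fixing the bound's structure) to obtain an upper bound on the log partition function.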

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-globerson07a,
  title     = {Approximate inference using conditional entropy decompositions},
  author    = {Globerson, Amir and Jaakkola, Tommi},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages     = {131--138},
  year      = {2007},
  editor    = {Meila, Marina and Shen, Xiaotong},
  volume    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Juan, Puerto Rico},
  month     = {21--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v2/globerson07a/globerson07a.pdf},
  url       = {https://proceedings.mlr.press/v2/globerson07a.html},
  abstract  = {We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.}
}
Endnote
%0 Conference Paper
%T Approximate inference using conditional entropy decompositions
%A Amir Globerson
%A Tommi Jaakkola
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-globerson07a
%I PMLR
%P 131--138
%U https://proceedings.mlr.press/v2/globerson07a.html
%V 2
%X We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.
RIS
TY  - CPAPER
TI  - Approximate inference using conditional entropy decompositions
AU  - Amir Globerson
AU  - Tommi Jaakkola
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-globerson07a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 2
SP  - 131
EP  - 138
L1  - http://proceedings.mlr.press/v2/globerson07a/globerson07a.pdf
UR  - https://proceedings.mlr.press/v2/globerson07a.html
AB  - We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.
ER  -
APA
Globerson, A. & Jaakkola, T. (2007). Approximate inference using conditional entropy decompositions. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 2:131-138. Available from https://proceedings.mlr.press/v2/globerson07a.html.