Approximate inference using conditional entropy decompositions

Amir Globerson, Tommi Jaakkola
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:131-138, 2007.

Abstract

We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.
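As a minimal, self-contained illustration of the two facts the abstract relies on (the entropy chain rule holds with equality, and dropping conditioning variables can only increase each term, yielding an upper bound on the joint entropy), the sketch below works through a toy three-variable distribution. The distribution and all function names are illustrative assumptions, not taken from the paper:

```python
# Toy illustration of the entropy chain rule and the conditional-entropy
# upper bound sketched in the abstract. The joint distribution below is
# arbitrary; any valid distribution would do.
import math

# Joint distribution over three binary variables (x1, x2, x3).
p = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.10,
    (1, 1, 0): 0.05, (1, 1, 1): 0.30,
}

def marginal(p, idx):
    """Marginal distribution over the variable indices in idx."""
    m = {}
    for x, px in p.items():
        key = tuple(x[i] for i in idx)
        m[key] = m.get(key, 0.0) + px
    return m

def entropy(m):
    """Shannon entropy (in bits) of a distribution given as a dict."""
    return -sum(px * math.log2(px) for px in m.values() if px > 0)

def cond_entropy(p, target, given):
    """H(X_target | X_given) = H(target, given) - H(given)."""
    return entropy(marginal(p, given + [target])) - entropy(marginal(p, given))

H_joint = entropy(p)  # exact joint entropy

# Chain rule (exact): H(X1,X2,X3) = H(X1) + H(X2|X1) + H(X3|X1,X2)
chain = (entropy(marginal(p, [0]))
         + cond_entropy(p, 1, [0])
         + cond_entropy(p, 2, [0, 1]))

# Dropping X1 from the last conditioning set can only increase that term
# ("conditioning reduces entropy"), so this decomposition is an upper bound:
# H(X1) + H(X2|X1) + H(X3|X2) >= H(X1,X2,X3)
bound = (entropy(marginal(p, [0]))
         + cond_entropy(p, 1, [0])
         + cond_entropy(p, 2, [1]))

print(f"H(joint)    = {H_joint:.4f}")
print(f"chain rule  = {chain:.4f}")
print(f"upper bound = {bound:.4f}")
```

In the paper's setting, the conditioning subsets are determined by an elimination order of the model variables, and the bound is optimized over marginals; the sketch only verifies the underlying inequality on a fixed distribution.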

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-globerson07a,
  title     = {Approximate inference using conditional entropy decompositions},
  author    = {Amir Globerson and Tommi Jaakkola},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages     = {131--138},
  year      = {2007},
  editor    = {Marina Meila and Xiaotong Shen},
  volume    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Juan, Puerto Rico},
  month     = {21--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v2/globerson07a/globerson07a.pdf},
  url       = {http://proceedings.mlr.press/v2/globerson07a.html},
  abstract  = {We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.}
}
Endnote
%0 Conference Paper
%T Approximate inference using conditional entropy decompositions
%A Amir Globerson
%A Tommi Jaakkola
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-globerson07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 131--138
%U http://proceedings.mlr.press
%V 2
%W PMLR
%X We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.
RIS
TY  - CPAPER
TI  - Approximate inference using conditional entropy decompositions
AU  - Amir Globerson
AU  - Tommi Jaakkola
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
PY  - 2007/03/11
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-globerson07a
PB  - PMLR
SP  - 131
EP  - 138
DP  - PMLR
L1  - http://proceedings.mlr.press/v2/globerson07a/globerson07a.pdf
UR  - http://proceedings.mlr.press/v2/globerson07a.html
AB  - We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both partition function bound, and accuracy of marginals.
ER  -
APA
Globerson, A. & Jaakkola, T. (2007). Approximate inference using conditional entropy decompositions. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in PMLR 2:131-138.