Domain-Liftability of Relational Marginal Polytopes

Ondrej Kuzelka, Yuyi Wang
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2284-2292, 2020.

Abstract

We study computational aspects of "relational marginal polytopes" which are statistical relational learning counterparts of marginal polytopes, well-known from probabilistic graphical models. Here, given some first-order logic formula, we can define its relational marginal statistic to be the fraction of groundings that make this formula true in a given possible world. For a list of first-order logic formulas, the relational marginal polytope is the set of all points that correspond to expected values of the relational marginal statistics that are realizable. In this paper we study the following two problems: (i) Do domain-liftability results for the partition functions of Markov logic networks (MLNs) carry over to the problem of relational marginal polytope construction? (ii) Is the relational marginal polytope containment problem hard under some plausible complexity-theoretic assumptions? Our positive results have consequences for lifted weight learning of MLNs. In particular, we show that weight learning of MLNs is domain-liftable whenever the computation of the partition function of the respective MLNs is domain-liftable (this result has not been rigorously proven before).
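The relational marginal statistic defined above, the fraction of groundings of a formula that are true in a possible world, can be illustrated with a small sketch. The names `marginal_statistic` and `phi` below are hypothetical, and we assume one common convention in which free variables are grounded to tuples of distinct domain constants (other conventions allow repeats):

```python
from itertools import permutations

def marginal_statistic(domain, world, formula, arity):
    """Fraction of groundings of `formula` that hold in `world`.

    domain:  list of constants
    world:   set of true ground atoms, e.g. {("sm", "a"), ("fr", "a", "b")}
    formula: predicate taking (grounding tuple, world) and returning bool
    arity:   number of free variables in the formula

    Grounds free variables to tuples of *distinct* constants (assumed
    convention; some formulations also allow repeated constants).
    """
    groundings = list(permutations(domain, arity))
    return sum(formula(g, world) for g in groundings) / len(groundings)

# Hypothetical example formula: phi(x, y) = smokes(x) AND friends(x, y)
def phi(g, world):
    x, y = g
    return ("sm", x) in world and ("fr", x, y) in world

domain = ["a", "b", "c"]
world = {("sm", "a"), ("fr", "a", "b"), ("fr", "a", "c")}
# 6 ordered pairs of distinct constants; phi holds for (a, b) and (a, c).
print(marginal_statistic(domain, world, phi, 2))  # → 0.3333...
```

A point in the relational marginal polytope for a list of formulas is then a vector of expected values of such statistics under some distribution over possible worlds.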

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-kuzelka20a,
  title = {Domain-Liftability of Relational Marginal Polytopes},
  author = {Kuzelka, Ondrej and Wang, Yuyi},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages = {2284--2292},
  year = {2020},
  editor = {Chiappa, Silvia and Calandra, Roberto},
  volume = {108},
  series = {Proceedings of Machine Learning Research},
  month = {26--28 Aug},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v108/kuzelka20a/kuzelka20a.pdf},
  url = {https://proceedings.mlr.press/v108/kuzelka20a.html},
  abstract = {We study computational aspects of "relational marginal polytopes" which are statistical relational learning counterparts of marginal polytopes, well-known from probabilistic graphical models. Here, given some first-order logic formula, we can define its relational marginal statistic to be the fraction of groundings that make this formula true in a given possible world. For a list of first-order logic formulas, the relational marginal polytope is the set of all points that correspond to expected values of the relational marginal statistics that are realizable. In this paper we study the following two problems: (i) Do domain-liftability results for the partition functions of Markov logic networks (MLNs) carry over to the problem of relational marginal polytope construction? (ii) Is the relational marginal polytope containment problem hard under some plausible complexity-theoretic assumptions? Our positive results have consequences for lifted weight learning of MLNs. In particular, we show that weight learning of MLNs is domain-liftable whenever the computation of the partition function of the respective MLNs is domain-liftable (this result has not been rigorously proven before).}
}
Endnote
%0 Conference Paper
%T Domain-Liftability of Relational Marginal Polytopes
%A Ondrej Kuzelka
%A Yuyi Wang
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-kuzelka20a
%I PMLR
%P 2284--2292
%U https://proceedings.mlr.press/v108/kuzelka20a.html
%V 108
%X We study computational aspects of "relational marginal polytopes" which are statistical relational learning counterparts of marginal polytopes, well-known from probabilistic graphical models. Here, given some first-order logic formula, we can define its relational marginal statistic to be the fraction of groundings that make this formula true in a given possible world. For a list of first-order logic formulas, the relational marginal polytope is the set of all points that correspond to expected values of the relational marginal statistics that are realizable. In this paper we study the following two problems: (i) Do domain-liftability results for the partition functions of Markov logic networks (MLNs) carry over to the problem of relational marginal polytope construction? (ii) Is the relational marginal polytope containment problem hard under some plausible complexity-theoretic assumptions? Our positive results have consequences for lifted weight learning of MLNs. In particular, we show that weight learning of MLNs is domain-liftable whenever the computation of the partition function of the respective MLNs is domain-liftable (this result has not been rigorously proven before).
APA
Kuzelka, O. & Wang, Y. (2020). Domain-Liftability of Relational Marginal Polytopes. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2284-2292. Available from https://proceedings.mlr.press/v108/kuzelka20a.html.