Entropic Inequality Constraints from e-separation Relations in Directed Acyclic Graphs with Hidden Variables

Noam Finkelstein, Beata Zjawin, Elie Wolfe, Ilya Shpitser, Robert W. Spekkens
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:1045-1055, 2021.

Abstract

Directed acyclic graphs (DAGs) with hidden variables are often used to characterize causal relations between variables in a system. When some variables are unobserved, DAGs imply a notoriously complicated set of constraints on the distribution of observed variables. In this work, we present entropic inequality constraints that are implied by e-separation relations in hidden variable DAGs with discrete observed variables. The constraints can intuitively be understood to follow from the fact that the capacity of variables along a causal pathway to convey information is restricted by their entropy; e.g., in the extreme case, a variable with entropy 0 can convey no information. We show how these constraints can be used to learn about the true causal model from an observed data distribution. In addition, we propose a measure of causal influence called the minimal mediary entropy, and demonstrate that it can concisely augment traditional measures such as the average treatment effect.
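The intuition stated in the abstract can be checked numerically for the simplest case: in a chain X → M → Y, the information X conveys to Y through the mediator M is bounded by H(M) (a data-processing-style bound). The sketch below uses hypothetical binary distributions chosen for illustration; none of the numbers come from the paper.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical chain X -> M -> Y with binary variables.
p_x = np.array([0.5, 0.5])
p_m_given_x = np.array([[0.9, 0.2],
                        [0.1, 0.8]])   # columns sum to 1
p_y_given_m = np.array([[0.85, 0.1],
                        [0.15, 0.9]])  # columns sum to 1

# Marginals and the X-Y joint induced by the chain.
p_xm = p_m_given_x * p_x    # p(m, x), shape (m, x)
p_m = p_xm.sum(axis=1)
p_xy = p_y_given_m @ p_xm   # p(y, x)
p_y = p_xy.sum(axis=1)

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
I_xy = entropy(p_x) + entropy(p_y) - entropy(p_xy.flatten())
H_m = entropy(p_m)

# The mediator's entropy caps the information flow along the path.
assert I_xy <= H_m + 1e-12
```

If M is made deterministic (entropy 0), I(X;Y) collapses to 0, matching the extreme case mentioned in the abstract; the paper's constraints generalize this idea to e-separation relations in arbitrary hidden-variable DAGs.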

Cite this Paper


BibTeX
@InProceedings{pmlr-v161-finkelstein21a,
  title     = {Entropic Inequality Constraints from e-separation Relations in Directed Acyclic Graphs with Hidden Variables},
  author    = {Finkelstein, Noam and Zjawin, Beata and Wolfe, Elie and Shpitser, Ilya and Spekkens, Robert W.},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages     = {1045--1055},
  year      = {2021},
  editor    = {de Campos, Cassio and Maathuis, Marloes H.},
  volume    = {161},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v161/finkelstein21a/finkelstein21a.pdf},
  url       = {https://proceedings.mlr.press/v161/finkelstein21a.html},
  abstract  = {Directed acyclic graphs (DAGs) with hidden variables are often used to characterize causal relations between variables in a system. When some variables are unobserved, DAGs imply a notoriously complicated set of constraints on the distribution of observed variables. In this work, we present entropic inequality constraints that are implied by e-separation relations in hidden variable DAGs with discrete observed variables. The constraints can intuitively be understood to follow from the fact that the capacity of variables along a causal pathway to convey information is restricted by their entropy; e.g., in the extreme case, a variable with entropy 0 can convey no information. We show how these constraints can be used to learn about the true causal model from an observed data distribution. In addition, we propose a measure of causal influence called the minimal mediary entropy, and demonstrate that it can concisely augment traditional measures such as the average treatment effect.}
}
Endnote
%0 Conference Paper
%T Entropic Inequality Constraints from e-separation Relations in Directed Acyclic Graphs with Hidden Variables
%A Noam Finkelstein
%A Beata Zjawin
%A Elie Wolfe
%A Ilya Shpitser
%A Robert W. Spekkens
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-finkelstein21a
%I PMLR
%P 1045--1055
%U https://proceedings.mlr.press/v161/finkelstein21a.html
%V 161
%X Directed acyclic graphs (DAGs) with hidden variables are often used to characterize causal relations between variables in a system. When some variables are unobserved, DAGs imply a notoriously complicated set of constraints on the distribution of observed variables. In this work, we present entropic inequality constraints that are implied by e-separation relations in hidden variable DAGs with discrete observed variables. The constraints can intuitively be understood to follow from the fact that the capacity of variables along a causal pathway to convey information is restricted by their entropy; e.g., in the extreme case, a variable with entropy 0 can convey no information. We show how these constraints can be used to learn about the true causal model from an observed data distribution. In addition, we propose a measure of causal influence called the minimal mediary entropy, and demonstrate that it can concisely augment traditional measures such as the average treatment effect.
APA
Finkelstein, N., Zjawin, B., Wolfe, E., Shpitser, I. & Spekkens, R.W. (2021). Entropic Inequality Constraints from e-separation Relations in Directed Acyclic Graphs with Hidden Variables. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:1045-1055. Available from https://proceedings.mlr.press/v161/finkelstein21a.html.