Causal Inference Despite Limited Global Confounding via Mixture Models

Spencer L. Gordon, Bijan Mazaheri, Yuval Rabani, Leonard Schulman
Proceedings of the Second Conference on Causal Learning and Reasoning, PMLR 213:574-601, 2023.

Abstract

A Bayesian Network is a directed acyclic graph (DAG) on a set of n random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite k-mixture of such models is graphically represented by a larger graph which has an additional “hidden” (or “latent”) random variable U, ranging in {1, …, k}, and a directed edge from U to every other vertex. Models of this type are fundamental to causal inference, where U models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with U, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the more well-studied “product” case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.
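As an illustration of the model class the abstract describes (a hypothetical toy example, not taken from the paper), consider a k=2 mixture of Bayesian networks on two binary variables with the edge X → Y. The latent U selects which component's conditional probability tables apply, and the observable joint distribution is obtained by marginalizing U out; all numerical parameters below are made up for the sketch:

```python
import itertools

k = 2                                  # number of mixture components
p_u = [0.6, 0.4]                       # mixing weights P(U = u)
p_x = [0.3, 0.8]                       # P(X = 1 | U = u)
p_y = [[0.2, 0.9],                     # p_y[u][x] = P(Y = 1 | X = x, U = u)
       [0.5, 0.1]]

def joint(x, y):
    """Observable joint P(X = x, Y = y), marginalizing out the latent U."""
    total = 0.0
    for u in range(k):
        px = p_x[u] if x == 1 else 1 - p_x[u]
        py = p_y[u][x] if y == 1 else 1 - p_y[u][x]
        total += p_u[u] * px * py
    return total

# The observable distribution is a proper distribution even though U is
# never seen; recovering p_u, p_x, p_y from joint(...) alone is the
# mixture-learning problem the paper addresses.
mass = sum(joint(x, y) for x, y in itertools.product([0, 1], repeat=2))
```

The point of the sketch is that `joint` is all an observer sees: the confounder U is summed away, which is what makes causal quantities unidentifiable until the mixture is resolved.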

Cite this Paper


BibTeX
@InProceedings{pmlr-v213-gordon23a,
  title     = {Causal Inference Despite Limited Global Confounding via Mixture Models},
  author    = {Gordon, Spencer L. and Mazaheri, Bijan and Rabani, Yuval and Schulman, Leonard},
  booktitle = {Proceedings of the Second Conference on Causal Learning and Reasoning},
  pages     = {574--601},
  year      = {2023},
  editor    = {van der Schaar, Mihaela and Zhang, Cheng and Janzing, Dominik},
  volume    = {213},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v213/gordon23a/gordon23a.pdf},
  url       = {https://proceedings.mlr.press/v213/gordon23a.html},
  abstract  = {A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional “hidden” (or “latent”) random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with $U$, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the more well-studied “product” case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.}
}
Endnote
%0 Conference Paper
%T Causal Inference Despite Limited Global Confounding via Mixture Models
%A Spencer L. Gordon
%A Bijan Mazaheri
%A Yuval Rabani
%A Leonard Schulman
%B Proceedings of the Second Conference on Causal Learning and Reasoning
%C Proceedings of Machine Learning Research
%D 2023
%E Mihaela van der Schaar
%E Cheng Zhang
%E Dominik Janzing
%F pmlr-v213-gordon23a
%I PMLR
%P 574--601
%U https://proceedings.mlr.press/v213/gordon23a.html
%V 213
%X A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional “hidden” (or “latent”) random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with $U$, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the more well-studied “product” case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.
APA
Gordon, S.L., Mazaheri, B., Rabani, Y. & Schulman, L. (2023). Causal Inference Despite Limited Global Confounding via Mixture Models. Proceedings of the Second Conference on Causal Learning and Reasoning, in Proceedings of Machine Learning Research 213:574-601. Available from https://proceedings.mlr.press/v213/gordon23a.html.