Identifying Confounding from Causal Mechanism Shifts
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4897-4905, 2024.
Abstract
Causal discovery methods commonly assume that the data are independent and identically distributed (i.i.d.) and that there are no unmeasured confounding variables. In practice, neither assumption is likely to hold, and detecting confounding in non-i.i.d. settings poses a significant challenge. Motivated by this, we explore how to discover confounders from data collected in multiple environments with causal mechanism shifts. We show that the mechanism changes of observed variables can reveal which variable sets are confounded. Based on this idea, we propose an empirically testable criterion based on mutual information, show under which conditions it can identify confounding, and introduce CoCo, a method for discovering confounders from data over multiple contexts. In our experiments, we show that CoCo works well on synthetic and real-world data.
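To make the intuition behind the criterion concrete, the following is a minimal, hedged sketch rather than the authors' CoCo implementation. It assumes a binary matrix of mechanism-change indicators is already available, where entry (i, e) is 1 iff the causal mechanism of variable i in environment e differs from a reference environment. Under the abstract's intuition, variables driven by a shared latent confounder tend to change together across environments, so dependent change patterns (high mutual information) flag candidate confounded pairs. The function name, the indicator construction, and the threshold are illustrative assumptions.

# Hedged sketch, assuming precomputed mechanism-change indicators;
# not the authors' CoCo implementation.
import numpy as np
from itertools import combinations
from sklearn.metrics import mutual_info_score

def confounded_pairs(change_indicators: np.ndarray, threshold: float = 0.1):
    """change_indicators: (n_vars, n_envs) binary array.
    Returns pairs (i, j, mi) whose change patterns across environments
    carry high mutual information, i.e. candidate confounded pairs."""
    n_vars = change_indicators.shape[0]
    pairs = []
    for i, j in combinations(range(n_vars), 2):
        mi = mutual_info_score(change_indicators[i], change_indicators[j])
        if mi > threshold:
            pairs.append((i, j, mi))
    return pairs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_envs = 50
    # Toy example: variables 0 and 1 respond to the same (hidden) source
    # of mechanism shifts, so their changes largely co-occur, while
    # variable 2 changes independently across environments.
    shared = rng.integers(0, 2, n_envs)
    C = np.vstack([
        shared,
        shared ^ (rng.random(n_envs) < 0.1),  # noisy copy of the shared pattern
        rng.integers(0, 2, n_envs),            # independent change pattern
    ])
    print(confounded_pairs(C))

In this toy setting, only the pair (0, 1) should exceed the threshold, mirroring the idea that co-occurring mechanism shifts point to a shared confounder.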