Almost optimal intervention sets for causal discovery

Frederick Eberhardt
Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, PMLR R6:161-168, 2008.

Abstract

We conjecture that the worst case number of experiments necessary and sufficient to discover a causal graph uniquely given its observational Markov equivalence class can be specified as a function of the largest clique in the Markov equivalence class. We provide an algorithm that computes intervention sets that we believe are optimal for the above task. The algorithm builds on insights gained from the worst case analysis in Eberhardt et al. (2005) for sequences of experiments when all possible directed acyclic graphs over N variables are considered. A simulation suggests that our conjecture is correct. We also show that a generalization of our conjecture to other classes of possible graph hypotheses cannot be given easily, and in what sense the algorithm is then no longer optimal.
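To make the clique-based idea concrete, here is a minimal Python sketch; it is not the paper's algorithm, and the names largest_clique and binary_code_interventions are illustrative only. It finds the largest clique among the unoriented edges of an equivalence class, the quantity the conjectured bound depends on, and then builds a schedule of multi-variable interventions by assigning the clique members distinct binary codes, so that every pair of clique members is split (one intervened on, the other passively observed) in at least one experiment. This separating-code construction is in the spirit of the worst-case analysis of Eberhardt et al. (2005); the paper's own intervention-set algorithm is not reproduced here.

```python
from itertools import combinations
from math import ceil, log2

def largest_clique(nodes, undirected_edges):
    """Brute-force search for a largest clique among the unoriented edges
    of an equivalence class (exponential, but fine for small examples)."""
    edge_set = {frozenset(e) for e in undirected_edges}
    for k in range(len(nodes), 0, -1):
        for subset in combinations(nodes, k):
            if all(frozenset(pair) in edge_set for pair in combinations(subset, 2)):
                return list(subset)
    return []

def binary_code_interventions(clique):
    """Give each clique member a distinct binary code; experiment j
    intervenes on exactly the members whose j-th bit is 1, so every pair
    of members is split (one intervened, one not) in some experiment."""
    c = len(clique)
    if c < 2:
        return []
    n_experiments = ceil(log2(c))
    return [
        {v for i, v in enumerate(clique) if (i >> j) & 1}
        for j in range(n_experiments)
    ]

if __name__ == "__main__":
    # Toy equivalence class whose unoriented part is a 4-clique.
    nodes = ["A", "B", "C", "D"]
    edges = [("A", "B"), ("A", "C"), ("A", "D"),
             ("B", "C"), ("B", "D"), ("C", "D")]
    clique = largest_clique(nodes, edges)
    print("largest clique:", clique)
    for j, targets in enumerate(binary_code_interventions(clique)):
        print(f"experiment {j}: intervene on {sorted(targets)}")
```

For the 4-clique in this toy example the schedule consists of two experiments, intervening on {B, D} and then on {C, D}; every pair of the four variables is split by at least one of them.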

Cite this Paper


BibTeX
@InProceedings{pmlr-vR6-eberhardt08a,
  title     = {Almost optimal intervention sets for causal discovery},
  author    = {Eberhardt, Frederick},
  booktitle = {Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence},
  pages     = {161--168},
  year      = {2008},
  editor    = {McAllester, David A. and Myllymäki, Petri},
  volume    = {R6},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/r6/main/assets/eberhardt08a/eberhardt08a.pdf},
  url       = {https://proceedings.mlr.press/r6/eberhardt08a.html},
  abstract  = {We conjecture that the worst case number of experiments necessary and sufficient to discover a causal graph uniquely given its observational Markov equivalence class can be specified as a function of the largest clique in the Markov equivalence class. We provide an algorithm that computes intervention sets that we believe are optimal for the above task. The algorithm builds on insights gained from the worst case analysis in Eberhardt et al. (2005) for sequences of experiments when all possible directed acyclic graphs over N variables are considered. A simulation suggests that our conjecture is correct. We also show that a generalization of our conjecture to other classes of possible graph hypotheses cannot be given easily, and in what sense the algorithm is then no longer optimal.},
  note      = {Reissued by PMLR on 09 October 2024.}
}
APA
Eberhardt, F. (2008). Almost optimal intervention sets for causal discovery. Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research R6:161-168. Available from https://proceedings.mlr.press/r6/eberhardt08a.html. Reissued by PMLR on 09 October 2024.

Related Material

Download PDF: https://raw.githubusercontent.com/mlresearch/r6/main/assets/eberhardt08a/eberhardt08a.pdf