Fundamental Properties of Causal Entropy and Information Gain

Francisco N. F. Q. Simoes, Mehdi Dastani, Thijs van Ommen
Proceedings of the Third Conference on Causal Learning and Reasoning, PMLR 236:188-208, 2024.

Abstract

Recent developments enable the quantification of causal control given a structural causal model (SCM). This has been accomplished by introducing quantities that encode changes in the entropy of one variable when intervening on another. These measures, named causal entropy and causal information gain, aim to address limitations in existing information-theoretic approaches for machine learning tasks where causality plays a crucial role. However, these quantities have not yet been formally studied in depth. Our research contributes to the formal understanding of causal entropy and causal information gain by establishing and analyzing fundamental properties of these concepts, including bounds and chain rules. Furthermore, we elucidate the relationship between causal entropy and stochastic interventions. We also propose definitions for causal conditional entropy and causal conditional information gain. Overall, this exploration paves the way for enhancing causal machine learning tasks through the study of recently proposed information-theoretic quantities grounded in considerations about causality.
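As a rough illustration of the quantities described in the abstract, the short Python sketch below compares the observational entropy of an outcome Y with its average entropy under interventions on a cause X. The toy SCM (Y := X XOR N), the uniform distribution over intervention values, and all names in the code are illustrative assumptions; this is not the authors' code and does not reproduce the paper's exact definitions.

from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a finite distribution given as {value: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Toy SCM (illustrative assumption, not from the paper):
#   X ~ Bernoulli(0.5)   (the cause, exogenous here)
#   N ~ Bernoulli(0.1)   (exogenous noise)
#   Y := X XOR N         (the outcome)
p_x = {0: 0.5, 1: 0.5}
p_n = {0: 0.9, 1: 0.1}

def p_y_do(x):
    """Distribution of Y under the atomic intervention do(X = x)."""
    out = {0: 0.0, 1: 0.0}
    for n, pn in p_n.items():
        out[x ^ n] += pn
    return out

# Observational distribution of Y (no intervention). Because X is exogenous
# in this toy model, conditioning on X = x and intervening do(X = x) coincide.
p_y = {0: 0.0, 1: 0.0}
for x, px in p_x.items():
    for y, py in p_y_do(x).items():
        p_y[y] += px * py

# Causal-entropy-style quantity: post-intervention entropy of Y, averaged over
# a chosen distribution of intervention values (uniform here).
pi = {0: 0.5, 1: 0.5}
causal_entropy = sum(pi[x] * entropy(p_y_do(x)) for x in pi)

# Causal-information-gain-style quantity: how much intervening on X reduces
# the entropy of Y relative to its observational entropy.
causal_info_gain = entropy(p_y) - causal_entropy

print(f"H(Y)                    = {entropy(p_y):.3f} bits")
print(f"causal entropy          = {causal_entropy:.3f} bits")
print(f"causal information gain = {causal_info_gain:.3f} bits")

In this toy model every intervention do(X = x) leaves Y with the noise entropy H(N) ≈ 0.469 bits, while observationally H(Y) = 1 bit, so intervening on X accounts for roughly 0.531 bits of Y's uncertainty.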

Cite this Paper


BibTeX
@InProceedings{pmlr-v236-simoes24a,
  title     = {Fundamental Properties of Causal Entropy and Information Gain},
  author    = {Simoes, Francisco N. F. Q. and Dastani, Mehdi and Ommen, Thijs van},
  booktitle = {Proceedings of the Third Conference on Causal Learning and Reasoning},
  pages     = {188--208},
  year      = {2024},
  editor    = {Locatello, Francesco and Didelez, Vanessa},
  volume    = {236},
  series    = {Proceedings of Machine Learning Research},
  month     = {01--03 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v236/simoes24a/simoes24a.pdf},
  url       = {https://proceedings.mlr.press/v236/simoes24a.html}
}
Endnote
%0 Conference Paper
%T Fundamental Properties of Causal Entropy and Information Gain
%A Francisco N. F. Q. Simoes
%A Mehdi Dastani
%A Thijs van Ommen
%B Proceedings of the Third Conference on Causal Learning and Reasoning
%C Proceedings of Machine Learning Research
%D 2024
%E Francesco Locatello
%E Vanessa Didelez
%F pmlr-v236-simoes24a
%I PMLR
%P 188--208
%U https://proceedings.mlr.press/v236/simoes24a.html
%V 236
APA
Simoes, F. N. F. Q., Dastani, M., & van Ommen, T. (2024). Fundamental Properties of Causal Entropy and Information Gain. Proceedings of the Third Conference on Causal Learning and Reasoning, in Proceedings of Machine Learning Research 236:188-208. Available from https://proceedings.mlr.press/v236/simoes24a.html.
