An Information Theoretic Framework for Continual Learning of Causal Networks.

Osman Mian, Sarah Mameche
Proceedings of The Second AAAI Bridge Program on Continual Causality, PMLR 268:1-10, 2025.

Abstract

Discovering causal networks, especially from observational data alone, is a fundamental yet challenging task. Existing causal discovery algorithms not only rely on strict assumptions, such as having i.i.d. data, but are also limited to working with static, fully specified datasets, rendering them incapable of learning causal networks in a continual fashion. In this short paper, we propose an information-theoretic approach that learns causal networks in a continual fashion, does not require the i.i.d. assumption on continually arriving data, and converges to the true underlying causal network as the samples within the accumulated batches of data converge to the underlying data-generating distribution. Our proposed approach, ConCausD, leverages the Algorithmic Markov Condition to discover causal networks in an online fashion. ConCausD is not only capable of continual learning but also provides multiple plausible causal graphs at the end of each iteration, whereas existing approaches can only predict a single causal network.
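To make the abstract's core idea concrete, the sketch below illustrates score-based, continual causal discovery in the spirit described: candidate causal networks are scored with a description-length criterion, re-scored as new data batches accumulate, and several top-ranked graphs are kept rather than a single winner. This is a minimal illustration under strong assumptions, not the paper's method: the linear-Gaussian MDL stand-in, the fixed candidate pool, and the names local_mdl_cost, graph_cost, and continual_discovery are all hypothetical and are not the actual ConCausD score or search procedure.

import numpy as np

def local_mdl_cost(child, parents, data):
    """Toy description-length cost of modelling column `child` from `parents`
    via least squares: a Gaussian data cost plus a BIC-style parameter penalty.
    Illustrative stand-in only, not the scoring function used in the paper."""
    n = data.shape[0]
    y = data[:, child]
    if parents:
        X = np.column_stack([data[:, list(parents)], np.ones(n)])
    else:
        X = np.ones((n, 1))
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    return 0.5 * n * np.log(rss / n + 1e-12) + 0.5 * X.shape[1] * np.log(n)

def graph_cost(dag, data):
    """Total cost of a candidate DAG, given as {node: tuple_of_parents}:
    per-node local costs add up, mirroring a modular, AMC-style score."""
    return sum(local_mdl_cost(v, ps, data) for v, ps in dag.items())

def continual_discovery(batches, candidate_dags, k=3):
    """Accumulate incoming batches, re-score a fixed pool of candidate DAGs
    after each batch, and yield the k best-scoring graphs at every step."""
    seen = None
    for batch in batches:
        seen = batch if seen is None else np.vstack([seen, batch])
        ranked = sorted(candidate_dags, key=lambda g: graph_cost(g, seen))
        yield ranked[:k]

The sketch simply re-ranks a fixed pool of DAGs on all data seen so far, so it naturally returns multiple plausible graphs after every batch and does not require the batches to be identically distributed; the convergence guarantees stated in the abstract, however, rest on the paper's own score and search, not on this toy criterion.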

Cite this Paper


BibTeX
@InProceedings{pmlr-v268-mian25a,
  title     = {An Information Theoretic Framework for Continual Learning of Causal Networks.},
  author    = {Mian, Osman and Mameche, Sarah},
  booktitle = {Proceedings of The Second AAAI Bridge Program on Continual Causality},
  pages     = {1--10},
  year      = {2025},
  editor    = {Mundt, Martin and Cooper, Keiland W. and Dhami, Devendra Singh and Hayes, Tyler and Herman, Rebecca and Ribeiro, Adéle and Smith, James Seale},
  volume    = {268},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--21 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v268/main/assets/mian25a/mian25a.pdf},
  url       = {https://proceedings.mlr.press/v268/mian25a.html}
}
Endnote
%0 Conference Paper
%T An Information Theoretic Framework for Continual Learning of Causal Networks.
%A Osman Mian
%A Sarah Mameche
%B Proceedings of The Second AAAI Bridge Program on Continual Causality
%C Proceedings of Machine Learning Research
%D 2025
%E Martin Mundt
%E Keiland W. Cooper
%E Devendra Singh Dhami
%E Tyler Hayes
%E Rebecca Herman
%E Adéle Ribeiro
%E James Seale Smith
%F pmlr-v268-mian25a
%I PMLR
%P 1--10
%U https://proceedings.mlr.press/v268/mian25a.html
%V 268
APA
Mian, O. & Mameche, S. (2025). An Information Theoretic Framework for Continual Learning of Causal Networks. Proceedings of The Second AAAI Bridge Program on Continual Causality, in Proceedings of Machine Learning Research 268:1-10. Available from https://proceedings.mlr.press/v268/mian25a.html.