AGM-TE: Approximate Generative Model Estimator of Transfer Entropy for Causal Discovery

Daniel Kornai, Ricardo Silva, Nikolaos Nikolaou
Proceedings of the Fourth Conference on Causal Learning and Reasoning, PMLR 275:947-990, 2025.

Abstract

The discovery of causal interactions from time series data is an increasingly common approach in science and engineering. Many of the approaches for solving it rely on an information-theoretic measure called transfer entropy (TE) to infer directed causal interactions. However, TE is difficult to estimate from empirical data, as non-parametric methods are hindered by the curse of dimensionality, while existing ML methods suffer from slow convergence or overfitting. In this work, we introduce AGM-TE, a novel ML method that estimates TE using the difference in the predictive capabilities of two alternative probabilistic forecasting models. In a comprehensive suite of TE estimation benchmarks (with 100+ tasks), AGM-TE achieves SoTA results in terms of accuracy and data efficiency when compared to existing non-parametric and ML estimators. AGM-TE further differentiates itself with the ability to estimate conditional transfer entropy, which helps mitigate the effect of confounding variables in systems with many interacting components. We demonstrate the strengths of our approach empirically by recovering patterns of brain connectivity from 250+ dimensional spike data that are consistent with known neuroanatomical results. Overall, we believe AGM-TE represents a significant step forward in the application of transfer entropy to problems of causal discovery from observational time series data.
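The core idea the abstract describes, estimating TE from the difference in predictive capability between two forecasting models, can be illustrated with a minimal sketch. This is not the authors' code: it stands in linear-Gaussian regressors for the paper's richer probabilistic forecasting models, and compares held-out negative log-likelihood with and without the past of the source series. All variable names and the toy data are assumptions for illustration only.

```python
# Hedged sketch of the two-model idea behind TE estimation (not AGM-TE itself):
# TE(x -> y) is approximated by the drop in held-out negative log-likelihood
# when a forecaster of y_t is also given the past of x. Both forecasters here
# are plain linear-Gaussian regressions; the paper uses approximate generative
# models in their place.
import numpy as np

def te_gaussian_sketch(x, y, train_frac=0.5):
    """Rough TE(x -> y) estimate in nats, with one-step histories."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    t = y[1:]                                  # prediction targets y_t
    ones = np.ones(len(t))
    X_restricted = np.column_stack([ones, y[:-1]])          # past of y only
    X_full = np.column_stack([ones, y[:-1], x[:-1]])        # past of y and x
    n_tr = int(len(t) * train_frac)

    def heldout_nll(X):
        # Fit a linear-Gaussian model on the train split ...
        w, *_ = np.linalg.lstsq(X[:n_tr], t[:n_tr], rcond=None)
        var = np.mean((t[:n_tr] - X[:n_tr] @ w) ** 2) + 1e-12
        # ... and score its Gaussian log-likelihood on the held-out split.
        err = t[n_tr:] - X[n_tr:] @ w
        return np.mean(0.5 * (np.log(2 * np.pi * var) + err ** 2 / var))

    return heldout_nll(X_restricted) - heldout_nll(X_full)

# Toy demo (assumed data, not from the paper): x drives y, but not vice versa.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for k in range(1, 2000):
    y[k] = 0.5 * y[k - 1] + 0.8 * x[k - 1] + 0.1 * rng.standard_normal()
te_xy = te_gaussian_sketch(x, y)   # clearly positive: x's past helps predict y
te_yx = te_gaussian_sketch(y, x)   # near zero: y's past does not help predict x
```

The asymmetry between `te_xy` and `te_yx` is what makes TE a directed measure; the conditional variant the paper adds would further condition both forecasters on the pasts of other observed series.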

Cite this Paper


BibTeX
@InProceedings{pmlr-v275-kornai25a,
  title     = {AGM-TE: Approximate Generative Model Estimator of Transfer Entropy for Causal Discovery},
  author    = {Kornai, Daniel and Silva, Ricardo and Nikolaou, Nikolaos},
  booktitle = {Proceedings of the Fourth Conference on Causal Learning and Reasoning},
  pages     = {947--990},
  year      = {2025},
  editor    = {Huang, Biwei and Drton, Mathias},
  volume    = {275},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--09 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v275/main/assets/kornai25a/kornai25a.pdf},
  url       = {https://proceedings.mlr.press/v275/kornai25a.html},
  abstract  = {The discovery of causal interactions from time series data is an increasingly common approach in science and engineering. Many of the approaches for solving it rely on an information-theoretic measure called transfer entropy (TE) to infer directed causal interactions. However, TE is difficult to estimate from empirical data, as non-parametric methods are hindered by the curse of dimensionality, while existing ML methods suffer from slow convergence or overfitting. In this work, we introduce AGM-TE, a novel ML method that estimates TE using the difference in the predictive capabilities of two alternative probabilistic forecasting models. In a comprehensive suite of TE estimation benchmarks (with 100+ tasks), AGM-TE achieves SoTA results in terms of accuracy and data efficiency when compared to existing non-parametric and ML estimators. AGM-TE further differentiates itself with the ability to estimate conditional transfer entropy, which helps mitigate the effect of confounding variables in systems with many interacting components. We demonstrate the strengths of our approach empirically by recovering patterns of brain connectivity from 250+ dimensional spike data that are consistent with known neuroanatomical results. Overall, we believe AGM-TE represents a significant step forward in the application of transfer entropy to problems of causal discovery from observational time series data.}
}
Endnote
%0 Conference Paper %T AGM-TE: Approximate Generative Model Estimator of Transfer Entropy for Causal Discovery %A Daniel Kornai %A Ricardo Silva %A Nikolaos Nikolaou %B Proceedings of the Fourth Conference on Causal Learning and Reasoning %C Proceedings of Machine Learning Research %D 2025 %E Biwei Huang %E Mathias Drton %F pmlr-v275-kornai25a %I PMLR %P 947--990 %U https://proceedings.mlr.press/v275/kornai25a.html %V 275 %X The discovery of causal interactions from time series data is an increasingly common approach in science and engineering. Many of the approaches for solving it rely on an information-theoretic measure called transfer entropy (TE) to infer directed causal interactions. However, TE is difficult to estimate from empirical data, as non-parametric methods are hindered by the curse of dimensionality, while existing ML methods suffer from slow convergence or overfitting. In this work, we introduce AGM-TE, a novel ML method that estimates TE using the difference in the predictive capabilities of two alternative probabilistic forecasting models. In a comprehensive suite of TE estimation benchmarks (with 100+ tasks), AGM-TE achieves SoTA results in terms of accuracy and data efficiency when compared to existing non-parametric and ML estimators. AGM-TE further differentiates itself with the ability to estimate conditional transfer entropy, which helps mitigate the effect of confounding variables in systems with many interacting components. We demonstrate the strengths of our approach empirically by recovering patterns of brain connectivity from 250+ dimensional spike data that are consistent with known neuroanatomical results. Overall, we believe AGM-TE represents a significant step forward in the application of transfer entropy to problems of causal discovery from observational time series data.
APA
Kornai, D., Silva, R. & Nikolaou, N. (2025). AGM-TE: Approximate Generative Model Estimator of Transfer Entropy for Causal Discovery. Proceedings of the Fourth Conference on Causal Learning and Reasoning, in Proceedings of Machine Learning Research 275:947-990. Available from https://proceedings.mlr.press/v275/kornai25a.html.