$\Psi$net: Efficient Causal Modeling at Scale

Florian Peter Busch, Moritz Willig, Jonas Seng, Kristian Kersting, Devendra Singh Dhami
Proceedings of The 12th International Conference on Probabilistic Graphical Models, PMLR 246:452-469, 2024.

Abstract

Being a ubiquitous aspect of human cognition, causality has made its way into modern-day machine-learning research. Despite its importance in real-world applications, contemporary research still struggles with high-dimensional causal problems. Leveraging the efficiency of probabilistic circuits, which offer tractable computation of marginal probabilities, we introduce $\Psi$net, a probabilistic model designed for large-scale causal inference. $\Psi$net is a type of sum-product network where layering and the einsum operation allow for efficient parallelization. By incorporating interventional data into the learning process, the model can learn the effects of interventions and make predictions based on the specific interventional setting. Overall, $\Psi$net is a causal probabilistic circuit that efficiently answers causal queries in large-scale problems. We present evaluations conducted on both synthetic data and a substantial real-world dataset, demonstrating $\Psi$net’s ability to capture causal relationships in high-dimensional settings.
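The abstract notes that layering and the einsum operation allow sum-product networks to be evaluated in parallel. As a purely illustrative sketch (not the paper's implementation), the snippet below shows how one layer of sum nodes in a probabilistic circuit can mix the log-probabilities of its children with a single `einsum` call over a batch; all array names, shapes, and the toy dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed for illustration): a batch of inputs,
# a layer of sum nodes, each mixing the same number of children.
batch, num_nodes, num_children = 4, 3, 5

# Child log-probabilities for each sum node: shape (batch, nodes, children).
child_logp = np.log(rng.uniform(0.1, 1.0, (batch, num_nodes, num_children)))

# Per-node mixture weights, normalized over children (softmax of logits).
logits = rng.normal(size=(num_nodes, num_children))
weights = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Log-sum-exp trick for numerical stability, then a single einsum
# mixes every child of every sum node across the whole batch at once.
m = child_logp.max(axis=2, keepdims=True)
mixed = np.einsum("bnc,nc->bn", np.exp(child_logp - m), weights)
sum_logp = m.squeeze(2) + np.log(mixed)  # shape (batch, num_nodes)
```

Because the per-node loops are replaced by one tensor contraction, a whole layer of sum nodes is evaluated in a single vectorized operation, which is the kind of parallelization the einsum formulation affords.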

Cite this Paper


BibTeX
@InProceedings{pmlr-v246-busch24a,
  title = {$\Psi$net: Efficient Causal Modeling at Scale},
  author = {Busch, Florian Peter and Willig, Moritz and Seng, Jonas and Kersting, Kristian and Dhami, Devendra Singh},
  booktitle = {Proceedings of The 12th International Conference on Probabilistic Graphical Models},
  pages = {452--469},
  year = {2024},
  editor = {Kwisthout, Johan and Renooij, Silja},
  volume = {246},
  series = {Proceedings of Machine Learning Research},
  month = {11--13 Sep},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v246/main/assets/busch24a/busch24a.pdf},
  url = {https://proceedings.mlr.press/v246/busch24a.html},
  abstract = {Being a ubiquitous aspect of human cognition, causality has made its way into modern-day machine-learning research. Despite its importance in real-world applications, contemporary research still struggles with high-dimensional causal problems. Leveraging the efficiency of probabilistic circuits, which offer tractable computation of marginal probabilities, we introduce $\Psi$net, a probabilistic model designed for large-scale causal inference. $\Psi$net is a type of sum-product network where layering and the einsum operation allow for efficient parallelization. By incorporating interventional data into the learning process, the model can learn the effects of interventions and make predictions based on the specific interventional setting. Overall, $\Psi$net is a causal probabilistic circuit that efficiently answers causal queries in large-scale problems. We present evaluations conducted on both synthetic data and a substantial real-world dataset, demonstrating $\Psi$net’s ability to capture causal relationships in high-dimensional settings.}
}
Endnote
%0 Conference Paper
%T $\Psi$net: Efficient Causal Modeling at Scale
%A Florian Peter Busch
%A Moritz Willig
%A Jonas Seng
%A Kristian Kersting
%A Devendra Singh Dhami
%B Proceedings of The 12th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2024
%E Johan Kwisthout
%E Silja Renooij
%F pmlr-v246-busch24a
%I PMLR
%P 452--469
%U https://proceedings.mlr.press/v246/busch24a.html
%V 246
%X Being a ubiquitous aspect of human cognition, causality has made its way into modern-day machine-learning research. Despite its importance in real-world applications, contemporary research still struggles with high-dimensional causal problems. Leveraging the efficiency of probabilistic circuits, which offer tractable computation of marginal probabilities, we introduce $\Psi$net, a probabilistic model designed for large-scale causal inference. $\Psi$net is a type of sum-product network where layering and the einsum operation allow for efficient parallelization. By incorporating interventional data into the learning process, the model can learn the effects of interventions and make predictions based on the specific interventional setting. Overall, $\Psi$net is a causal probabilistic circuit that efficiently answers causal queries in large-scale problems. We present evaluations conducted on both synthetic data and a substantial real-world dataset, demonstrating $\Psi$net’s ability to capture causal relationships in high-dimensional settings.
APA
Busch, F.P., Willig, M., Seng, J., Kersting, K. & Dhami, D.S. (2024). $\Psi$net: Efficient Causal Modeling at Scale. Proceedings of The 12th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 246:452-469. Available from https://proceedings.mlr.press/v246/busch24a.html.