A Recipe for Causal Graph Regression: Confounding Effects Revisited

Yujia Yin, Tianyi Qu, Zihao Wang, Yifan Chen
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:72414-72427, 2025.

Abstract

By recognizing causal subgraphs, causal graph learning (CGL) has emerged as a promising approach for improving the generalizability of graph neural networks under out-of-distribution (OOD) scenarios. However, the empirical successes of CGL techniques are mostly demonstrated in classification settings, while regression tasks, a more challenging setting in graph learning, have been overlooked. We therefore devote this work to tackling causal graph regression (CGR); to this end, we reshape the handling of confounding effects in existing CGL studies, which mainly address classification. Specifically, we reflect on the predictive power of confounders in graph-level regression, and generalize classification-specific causal intervention techniques to regression through the lens of contrastive learning. Extensive experiments on graph OOD benchmarks validate the efficacy of our proposals for CGR. The model implementation and code are available at https://github.com/causal-graph/CGR.
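To make the abstract's idea of "intervention via contrastive learning" concrete, below is a minimal, hypothetical PyTorch sketch of how a backdoor-style intervention could be paired with a contrastive objective in a regression setting. It is not the authors' released implementation (see the linked repository for that); the function name, the causal/confounder embedding split, and the loss weighting are all assumptions for illustration only.

import torch
import torch.nn.functional as F

def intervention_contrastive_loss(h_causal, h_conf, regressor, y, tau=0.5):
    """Hypothetical sketch, not the paper's code.

    h_causal:  (B, d) embeddings of the estimated causal subgraphs
    h_conf:    (B, d) embeddings of the estimated confounding subgraphs
    regressor: callable mapping (B, d) embeddings to (B, 1) predictions
    y:         (B,) regression targets
    """
    B = h_causal.size(0)

    # Backdoor-style intervention: splice each causal embedding with a
    # confounder embedding drawn from a *different* graph in the batch.
    perm = torch.randperm(B, device=h_causal.device)
    h_intervened = h_causal + h_conf[perm]

    # Regression loss on both views: predictions should follow the causal
    # part regardless of which confounder was spliced in.
    pred_causal = regressor(h_causal).squeeze(-1)
    pred_interv = regressor(h_intervened).squeeze(-1)
    reg_loss = F.mse_loss(pred_causal, y) + F.mse_loss(pred_interv, y)

    # Contrastive term (InfoNCE-style): each intervened view should stay
    # closest to the causal view of the same graph, standing in for the
    # class-probability alignment used in classification settings.
    z1 = F.normalize(h_causal, dim=-1)
    z2 = F.normalize(h_intervened, dim=-1)
    logits = z1 @ z2.t() / tau                      # (B, B) similarities
    labels = torch.arange(B, device=logits.device)  # positives on diagonal
    con_loss = F.cross_entropy(logits, labels)

    return reg_loss + con_loss

In use, such a loss would be added to the ordinary training objective of a GNN that first partitions each input graph into causal and confounding subgraphs; the exact partitioning and weighting in the paper may differ.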

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-yin25d,
  title     = {A Recipe for Causal Graph Regression: Confounding Effects Revisited},
  author    = {Yin, Yujia and Qu, Tianyi and Wang, Zihao and Chen, Yifan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {72414--72427},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/yin25d/yin25d.pdf},
  url       = {https://proceedings.mlr.press/v267/yin25d.html},
  abstract  = {Through recognizing causal subgraphs, causal graph learning (CGL) has risen to be a promising approach for improving the generalizability of graph neural networks under out-of-distribution (OOD) scenarios. However, the empirical successes of CGL techniques are mostly exemplified in classification settings, while regression tasks, a more challenging setting in graph learning, are overlooked. We thus devote this work to tackling causal graph regression (CGR); to this end we reshape the processing of confounding effects in existing CGL studies, which mainly deal with classification. Specifically, we reflect on the predictive power of confounders in graph-level regression, and generalize classification-specific causal intervention techniques to regression through a lens of contrastive learning. Extensive experiments on graph OOD benchmarks validate the efficacy of our proposals for CGR. The model implementation and the code are provided on https://github.com/causal-graph/CGR.}
}
Endnote
%0 Conference Paper
%T A Recipe for Causal Graph Regression: Confounding Effects Revisited
%A Yujia Yin
%A Tianyi Qu
%A Zihao Wang
%A Yifan Chen
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-yin25d
%I PMLR
%P 72414--72427
%U https://proceedings.mlr.press/v267/yin25d.html
%V 267
%X Through recognizing causal subgraphs, causal graph learning (CGL) has risen to be a promising approach for improving the generalizability of graph neural networks under out-of-distribution (OOD) scenarios. However, the empirical successes of CGL techniques are mostly exemplified in classification settings, while regression tasks, a more challenging setting in graph learning, are overlooked. We thus devote this work to tackling causal graph regression (CGR); to this end we reshape the processing of confounding effects in existing CGL studies, which mainly deal with classification. Specifically, we reflect on the predictive power of confounders in graph-level regression, and generalize classification-specific causal intervention techniques to regression through a lens of contrastive learning. Extensive experiments on graph OOD benchmarks validate the efficacy of our proposals for CGR. The model implementation and the code are provided on https://github.com/causal-graph/CGR.
APA
Yin, Y., Qu, T., Wang, Z., & Chen, Y. (2025). A Recipe for Causal Graph Regression: Confounding Effects Revisited. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:72414-72427. Available from https://proceedings.mlr.press/v267/yin25d.html.
