Generalization Bounds for Causal Regression: Insights, Guarantees and Sensitivity Analysis

Daniel Csillag, Claudio Jose Struchiner, Guilherme Tegoni Goedert
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:9576-9602, 2024.

Abstract

Many algorithms have been recently proposed for causal machine learning. Yet, there is little to no theory on their quality, especially considering finite samples. In this work, we propose a theory based on generalization bounds that provides such guarantees. By introducing a novel change-of-measure inequality, we are able to tightly bound the model loss in terms of the deviation of the treatment propensities over the population, which we show can be empirically limited. Our theory is fully rigorous and holds even in the face of hidden confounding and violations of positivity. We demonstrate our bounds on semi-synthetic and real data, showcasing their remarkable tightness and practical utility.
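To make the abstract's central mechanism concrete: change-of-measure arguments of this flavor re-express a risk under one distribution (say, over the whole population) as a reweighted risk under another (say, over the treated units only), with weights driven by the treatment propensities. The sketch below illustrates only the classical propensity-reweighting identity that such bounds build on, using synthetic data; it is a hypothetical illustration, not the paper's inequality, and all names (e for the propensity, y1_hat for a fitted outcome model) are invented for the example. Note that this toy identity assumes known propensities and no hidden confounding, whereas the paper's bounds explicitly do not.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: covariate x, propensity e(x), treatment t, outcome y.
n = 10_000
x = rng.normal(size=n)
e = 1.0 / (1.0 + np.exp(-x))                      # P(T = 1 | X = x)
t = rng.binomial(1, e)
y = 2.0 * x + t + rng.normal(scale=0.5, size=n)

# A (toy) fitted model for the treated potential outcome.
y1_hat = 2.0 * x + 1.0

# Squared loss is only observable on treated units (factual data).
treated = t == 1
factual_loss = (y[treated] - y1_hat[treated]) ** 2

# Change of measure (under ignorability, T independent of Y(1) given X):
#   E[loss] = E[loss * T / e(X)] = E[loss / e(X) | T = 1] * P(T = 1),
# valid whenever e(X) > 0, so propensity weights transfer the
# treated-only average to a population-level average.
weights = 1.0 / e[treated]
population_loss = np.mean(factual_loss * weights) * np.mean(t)

print(f"treated-only loss:       {factual_loss.mean():.4f}")
print(f"reweighted (population): {population_loss:.4f}")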

Cite this Paper

BibTeX

@InProceedings{pmlr-v235-csillag24a,
  title     = {Generalization Bounds for Causal Regression: Insights, Guarantees and Sensitivity Analysis},
  author    = {Csillag, Daniel and Struchiner, Claudio Jose and Goedert, Guilherme Tegoni},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {9576--9602},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/csillag24a/csillag24a.pdf},
  url       = {https://proceedings.mlr.press/v235/csillag24a.html}
}
APA
Csillag, D., Struchiner, C.J. & Goedert, G.T. (2024). Generalization Bounds for Causal Regression: Insights, Guarantees and Sensitivity Analysis. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:9576-9602. Available from https://proceedings.mlr.press/v235/csillag24a.html.