Reducing Balancing Error for Causal Inference via Optimal Transport

Yuguang Yan, Hao Zhou, Zeqin Yang, Weilin Chen, Ruichu Cai, Zhifeng Hao
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:55913-55927, 2024.

Abstract

Most studies on causal inference tackle the issue of confounding bias by reducing the distribution shift between the control and treated groups. However, it remains an open question how to choose an appropriate metric for distribution shift in practice. In this paper, we define a generic balancing error on reweighted samples to characterize the confounding bias, and study the connection between the balancing error and the Wasserstein discrepancy derived from the theory of optimal transport. We not only regard the Wasserstein discrepancy as the metric of distribution shift, but also explore the association between the balancing error and the underlying cost function involved in the Wasserstein discrepancy. Motivated by this, we propose to reduce the balancing error under the framework of optimal transport with learnable marginal distributions and cost function, which is implemented by jointly learning weights and representations associated with factual outcomes. Experiments on both synthetic and real-world datasets demonstrate the effectiveness of our proposed method.
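To make the idea concrete, the following is a minimal, illustrative sketch (not the authors' released code) of how a Wasserstein discrepancy between reweighted control and treated samples can be evaluated with entropy-regularized optimal transport, where the sample weights act as the marginal distributions of the transport plan. The function name, the squared-Euclidean ground cost, and the hyperparameters reg and n_iters are our own illustrative choices.

# Minimal sketch: entropic Wasserstein discrepancy between two weighted
# empirical measures (control vs. treated), computed with plain Sinkhorn
# iterations in NumPy. The weights wc, wt play the role of the marginals.
import numpy as np

def sinkhorn_discrepancy(Xc, Xt, wc, wt, reg=0.1, n_iters=200):
    """Entropy-regularized OT cost <P, C> between weighted samples."""
    # Squared Euclidean ground cost, normalized for numerical stability.
    C = ((Xc[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)
    C = C / C.mean()
    K = np.exp(-C / reg)                      # Gibbs kernel
    u = np.full(len(wc), 1.0 / len(wc))
    v = np.full(len(wt), 1.0 / len(wt))
    for _ in range(n_iters):                  # Sinkhorn fixed-point updates
        u = wc / (K @ v)
        v = wt / (K.T @ u)
    P = u[:, None] * K * v[None, :]           # transport plan with marginals (wc, wt)
    return float((P * C).sum())               # transport cost

# Toy usage with uniform weights on two small Gaussian samples.
rng = np.random.default_rng(0)
Xc, Xt = rng.normal(0.0, 1.0, (50, 5)), rng.normal(0.5, 1.0, (40, 5))
wc, wt = np.full(50, 1 / 50), np.full(40, 1 / 40)
print(sinkhorn_discrepancy(Xc, Xt, wc, wt))

In the paper's full method, the weights and the representations (and hence the cost) are learned jointly with the factual outcomes to reduce the balancing error; the sketch above only shows how the discrepancy itself can be evaluated for fixed weights and features.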

Cite this Paper
BibTeX
@InProceedings{pmlr-v235-yan24i,
  title     = {Reducing Balancing Error for Causal Inference via Optimal Transport},
  author    = {Yan, Yuguang and Zhou, Hao and Yang, Zeqin and Chen, Weilin and Cai, Ruichu and Hao, Zhifeng},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {55913--55927},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/yan24i/yan24i.pdf},
  url       = {https://proceedings.mlr.press/v235/yan24i.html},
  abstract  = {Most studies on causal inference tackle the issue of confounding bias by reducing the distribution shift between the control and treated groups. However, it remains an open question how to choose an appropriate metric for distribution shift in practice. In this paper, we define a generic balancing error on reweighted samples to characterize the confounding bias, and study the connection between the balancing error and the Wasserstein discrepancy derived from the theory of optimal transport. We not only regard the Wasserstein discrepancy as the metric of distribution shift, but also explore the association between the balancing error and the underlying cost function involved in the Wasserstein discrepancy. Motivated by this, we propose to reduce the balancing error under the framework of optimal transport with learnable marginal distributions and cost function, which is implemented by jointly learning weights and representations associated with factual outcomes. Experiments on both synthetic and real-world datasets demonstrate the effectiveness of our proposed method.}
}
Endnote
%0 Conference Paper
%T Reducing Balancing Error for Causal Inference via Optimal Transport
%A Yuguang Yan
%A Hao Zhou
%A Zeqin Yang
%A Weilin Chen
%A Ruichu Cai
%A Zhifeng Hao
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-yan24i
%I PMLR
%P 55913--55927
%U https://proceedings.mlr.press/v235/yan24i.html
%V 235
%X Most studies on causal inference tackle the issue of confounding bias by reducing the distribution shift between the control and treated groups. However, it remains an open question how to choose an appropriate metric for distribution shift in practice. In this paper, we define a generic balancing error on reweighted samples to characterize the confounding bias, and study the connection between the balancing error and the Wasserstein discrepancy derived from the theory of optimal transport. We not only regard the Wasserstein discrepancy as the metric of distribution shift, but also explore the association between the balancing error and the underlying cost function involved in the Wasserstein discrepancy. Motivated by this, we propose to reduce the balancing error under the framework of optimal transport with learnable marginal distributions and cost function, which is implemented by jointly learning weights and representations associated with factual outcomes. Experiments on both synthetic and real-world datasets demonstrate the effectiveness of our proposed method.
APA
Yan, Y., Zhou, H., Yang, Z., Chen, W., Cai, R. & Hao, Z. (2024). Reducing Balancing Error for Causal Inference via Optimal Transport. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:55913-55927. Available from https://proceedings.mlr.press/v235/yan24i.html.
