Conditional Distributional Treatment Effect with Kernel Conditional Mean Embeddings and U-Statistic Regression

Junhyung Park, Uri Shalit, Bernhard Schölkopf, Krikamol Muandet
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8401-8412, 2021.

Abstract

We propose to analyse the conditional distributional treatment effect (CoDiTE), which, in contrast to the more common conditional average treatment effect (CATE), is designed to encode a treatment’s distributional aspects beyond the mean. We first introduce a formal definition of the CoDiTE associated with a distance function between probability measures. Then we discuss the CoDiTE associated with the maximum mean discrepancy via kernel conditional mean embeddings, which, coupled with a hypothesis test, tells us whether there is any conditional distributional effect of the treatment. Finally, we investigate what kind of conditional distributional effect the treatment has, both in an exploratory manner via the conditional witness function, and in a quantitative manner via U-statistic regression, generalising the CATE to higher-order moments. Experiments on synthetic, semi-synthetic and real datasets demonstrate the merits of our approach.
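To make the MMD-based CoDiTE described above concrete, the sketch below is a minimal plug-in illustration and not the authors' released code: in the notation of the abstract, the CoDiTE associated with a distance D is, roughly, U_D(x) = D(P(Y(1)|X=x), P(Y(0)|X=x)), here with D taken to be the maximum mean discrepancy. For each treatment group the conditional mean embedding of the outcome given covariates is estimated by kernel ridge regression, and the squared RKHS distance between the treated and control embeddings is evaluated at query covariate values. Kernel bandwidths, the ridge parameter lam, and the helper names rbf and codite_mmd_sq are illustrative assumptions rather than choices taken from the paper.

import numpy as np

def rbf(A, B, gamma):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def codite_mmd_sq(X0, Y0, X1, Y1, x_query, gamma_x=1.0, gamma_y=1.0, lam=1e-2):
    # Plug-in estimate of the squared MMD between P(Y(0) | X = x) and
    # P(Y(1) | X = x) at each row of x_query. Inputs are 2-D arrays of
    # shape (n_t, d) for covariates and (n_t, p) for outcomes, per group t.
    n0, n1 = len(X0), len(X1)
    # Kernel-ridge weights beta_t(x) = (K_t + n_t * lam * I)^{-1} k_t(x),
    # which define the estimated conditional mean embedding of group t at x.
    B0 = np.linalg.solve(rbf(X0, X0, gamma_x) + n0 * lam * np.eye(n0),
                         rbf(X0, x_query, gamma_x))
    B1 = np.linalg.solve(rbf(X1, X1, gamma_x) + n1 * lam * np.eye(n1),
                         rbf(X1, x_query, gamma_x))
    # Gram matrices of the outcome kernel within and across groups.
    L00, L11, L01 = rbf(Y0, Y0, gamma_y), rbf(Y1, Y1, gamma_y), rbf(Y0, Y1, gamma_y)
    # Squared RKHS norm ||mu_1(x) - mu_0(x)||^2, one value per query point.
    return (np.einsum('iq,ij,jq->q', B1, L11, B1)
            - 2.0 * np.einsum('iq,ij,jq->q', B0, L01, B1)
            + np.einsum('iq,ij,jq->q', B0, L00, B0))

# Toy usage: the two groups share the same conditional mean, but the treated
# outcomes have covariate-dependent variance, so the estimated CoDiTE is
# non-zero even though the CATE is (approximately) zero everywhere.
rng = np.random.default_rng(0)
X0 = rng.uniform(-2, 2, size=(200, 1)); Y0 = rng.normal(0.0, 1.0, size=(200, 1))
X1 = rng.uniform(-2, 2, size=(200, 1)); Y1 = rng.normal(0.0, 1.0 + np.abs(X1))
xq = np.linspace(-2, 2, 5).reshape(-1, 1)
print(codite_mmd_sq(X0, Y0, X1, Y1, xq))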

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-park21c,
  title     = {Conditional Distributional Treatment Effect with Kernel Conditional Mean Embeddings and U-Statistic Regression},
  author    = {Park, Junhyung and Shalit, Uri and Sch{\"o}lkopf, Bernhard and Muandet, Krikamol},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {8401--8412},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/park21c/park21c.pdf},
  url       = {https://proceedings.mlr.press/v139/park21c.html},
  abstract  = {We propose to analyse the conditional distributional treatment effect (CoDiTE), which, in contrast to the more common conditional average treatment effect (CATE), is designed to encode a treatment’s distributional aspects beyond the mean. We first introduce a formal definition of the CoDiTE associated with a distance function between probability measures. Then we discuss the CoDiTE associated with the maximum mean discrepancy via kernel conditional mean embeddings, which, coupled with a hypothesis test, tells us whether there is any conditional distributional effect of the treatment. Finally, we investigate what kind of conditional distributional effect the treatment has, both in an exploratory manner via the conditional witness function, and in a quantitative manner via U-statistic regression, generalising the CATE to higher-order moments. Experiments on synthetic, semi-synthetic and real datasets demonstrate the merits of our approach.}
}
Endnote
%0 Conference Paper
%T Conditional Distributional Treatment Effect with Kernel Conditional Mean Embeddings and U-Statistic Regression
%A Junhyung Park
%A Uri Shalit
%A Bernhard Schölkopf
%A Krikamol Muandet
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-park21c
%I PMLR
%P 8401--8412
%U https://proceedings.mlr.press/v139/park21c.html
%V 139
%X We propose to analyse the conditional distributional treatment effect (CoDiTE), which, in contrast to the more common conditional average treatment effect (CATE), is designed to encode a treatment’s distributional aspects beyond the mean. We first introduce a formal definition of the CoDiTE associated with a distance function between probability measures. Then we discuss the CoDiTE associated with the maximum mean discrepancy via kernel conditional mean embeddings, which, coupled with a hypothesis test, tells us whether there is any conditional distributional effect of the treatment. Finally, we investigate what kind of conditional distributional effect the treatment has, both in an exploratory manner via the conditional witness function, and in a quantitative manner via U-statistic regression, generalising the CATE to higher-order moments. Experiments on synthetic, semi-synthetic and real datasets demonstrate the merits of our approach.
APA
Park, J., Shalit, U., Schölkopf, B., & Muandet, K. (2021). Conditional Distributional Treatment Effect with Kernel Conditional Mean Embeddings and U-Statistic Regression. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8401-8412. Available from https://proceedings.mlr.press/v139/park21c.html.
