Stochastic Tree Ensembles for Estimating Heterogeneous Effects

Nikolay Krantsevich, Jingyu He, P. Richard Hahn
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:6120-6131, 2023.

Abstract

Determining subgroups that respond especially well (or poorly) to specific interventions (medical or policy) requires new supervised learning methods tailored specifically for causal inference. Bayesian Causal Forest (BCF) is a recent method that has been documented to perform well on data generating processes with strong confounding of the sort that is plausible in many applications. This paper develops a novel algorithm for fitting the BCF model, which is more efficient than the previous Gibbs sampler. The new algorithm can be used to initialize independent chains of the existing Gibbs sampler leading to better posterior exploration and coverage of the associated interval estimates in simulation studies. The new algorithm is compared to related approaches via simulation studies as well as an empirical analysis.
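The paper's key computational idea — running a fast fitting pass and using its output to initialize several independent chains of a Gibbs sampler — can be illustrated on a toy model. The sketch below is not the BCF algorithm; it warm-starts multiple Gibbs chains for a simple normal model with unknown mean and variance, with `fast_point_estimate` standing in (hypothetically) for the paper's efficient fitting algorithm.

```python
import random

def fast_point_estimate(data):
    # Stand-in for an efficient fitting pass (hypothetical): returns
    # crude point estimates (mean, variance) used to warm-start chains.
    n = len(data)
    m = sum(data) / n
    v = sum((x - m) ** 2 for x in data) / n
    return m, v

def gibbs_chain(data, init, n_iter=500, seed=0):
    # Toy Gibbs sampler for a normal model with unknown mean mu and
    # variance sig2 under flat priors, started from `init`.
    rng = random.Random(seed)
    n = len(data)
    xbar = sum(data) / n
    mu, sig2 = init
    draws = []
    for _ in range(n_iter):
        # mu | sig2, data ~ Normal(xbar, sig2 / n)
        mu = rng.gauss(xbar, (sig2 / n) ** 0.5)
        # sig2 | mu, data: draw via a chi-squared variate built from
        # n squared standard normals (toy inverse-chi-squared update)
        ss = sum((x - mu) ** 2 for x in data)
        chi2 = sum(rng.gauss(0, 1) ** 2 for _ in range(n))
        sig2 = ss / chi2
        draws.append((mu, sig2))
    return draws

rng = random.Random(1)
data = [rng.gauss(2.0, 1.0) for _ in range(200)]
init = fast_point_estimate(data)
# Independent chains share the warm start but use different seeds,
# mirroring the multi-chain initialization strategy described above.
chains = [gibbs_chain(data, init, seed=s) for s in range(4)]
```

Each chain explores the posterior from the same well-placed starting point, which is the mechanism the abstract credits for improved posterior exploration and interval coverage.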

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-krantsevich23a,
  title     = {Stochastic Tree Ensembles for Estimating Heterogeneous Effects},
  author    = {Krantsevich, Nikolay and He, Jingyu and Hahn, P. Richard},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {6120--6131},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/krantsevich23a/krantsevich23a.pdf},
  url       = {https://proceedings.mlr.press/v206/krantsevich23a.html},
  abstract  = {Determining subgroups that respond especially well (or poorly) to specific interventions (medical or policy) requires new supervised learning methods tailored specifically for causal inference. Bayesian Causal Forest (BCF) is a recent method that has been documented to perform well on data generating processes with strong confounding of the sort that is plausible in many applications. This paper develops a novel algorithm for fitting the BCF model, which is more efficient than the previous Gibbs sampler. The new algorithm can be used to initialize independent chains of the existing Gibbs sampler leading to better posterior exploration and coverage of the associated interval estimates in simulation studies. The new algorithm is compared to related approaches via simulation studies as well as an empirical analysis.}
}
Endnote
%0 Conference Paper
%T Stochastic Tree Ensembles for Estimating Heterogeneous Effects
%A Nikolay Krantsevich
%A Jingyu He
%A P. Richard Hahn
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-krantsevich23a
%I PMLR
%P 6120--6131
%U https://proceedings.mlr.press/v206/krantsevich23a.html
%V 206
%X Determining subgroups that respond especially well (or poorly) to specific interventions (medical or policy) requires new supervised learning methods tailored specifically for causal inference. Bayesian Causal Forest (BCF) is a recent method that has been documented to perform well on data generating processes with strong confounding of the sort that is plausible in many applications. This paper develops a novel algorithm for fitting the BCF model, which is more efficient than the previous Gibbs sampler. The new algorithm can be used to initialize independent chains of the existing Gibbs sampler leading to better posterior exploration and coverage of the associated interval estimates in simulation studies. The new algorithm is compared to related approaches via simulation studies as well as an empirical analysis.
APA
Krantsevich, N., He, J. & Hahn, P.R. (2023). Stochastic Tree Ensembles for Estimating Heterogeneous Effects. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:6120-6131. Available from https://proceedings.mlr.press/v206/krantsevich23a.html.