Outcome Assumptions and Duality Theory for Balancing Weights

David A. Bruns-Smith, Avi Feller
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:11037-11055, 2022.

Abstract

We study balancing weight estimators, which reweight outcomes from a source population to estimate missing outcomes in a target population. These estimators minimize the worst-case error by making an assumption about the outcome model. In this paper, we show that this outcome assumption has two immediate implications. First, we can replace the minimax optimization problem for balancing weights with a simple convex loss over the assumed outcome function class. Second, we can replace the commonly-made overlap assumption with a more appropriate quantitative measure, the minimum worst-case bias. Finally, we show conditions under which the weights remain robust when our assumptions on the outcomes are wrong.
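As a concrete illustration of the paper's first point, consider the simplest case of a linear outcome class with bounded coefficients. The worst-case bias of a set of weights over this class reduces to the norm of the covariate imbalance between the reweighted source and the target, so the weights can be found by minimizing a plain convex (least-squares) loss instead of solving a minimax problem. The sketch below is not the authors' estimator; it is a minimal hypothetical example with simulated data and an illustrative ridge penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: source covariates X_s, shifted target covariates X_t.
n_s, n_t, d = 200, 100, 5
X_s = rng.normal(size=(n_s, d))
X_t = rng.normal(loc=0.3, size=(n_t, d))

# Assumed outcome class: linear functions f(x) = beta @ x with ||beta|| <= 1.
# For this class, the worst-case bias of weights w is ||X_s^T w - mean(X_t)||,
# so the minimax problem collapses to a convex least-squares objective:
#     min_w ||X_s^T w - mean(X_t)||^2 + lam * ||w||^2
target_mean = X_t.mean(axis=0)
lam = 1e-2  # illustrative ridge penalty on the weights for stability

# Closed-form ridge solution: (X_s X_s^T + lam I) w = X_s mean(X_t)
w = np.linalg.solve(X_s @ X_s.T + lam * np.eye(n_s), X_s @ target_mean)

# Remaining worst-case bias (covariate imbalance) after reweighting.
imbalance = np.linalg.norm(X_s.T @ w - target_mean)
print(float(imbalance))
```

With many more source units than covariates, the imbalance after reweighting is driven essentially to zero, far below the raw mean difference between the two samples.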

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-bruns-smith22a,
  title     = {Outcome Assumptions and Duality Theory for Balancing Weights},
  author    = {Bruns-Smith, David A. and Feller, Avi},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {11037--11055},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/bruns-smith22a/bruns-smith22a.pdf},
  url       = {https://proceedings.mlr.press/v151/bruns-smith22a.html},
  abstract  = {We study balancing weight estimators, which reweight outcomes from a source population to estimate missing outcomes in a target population. These estimators minimize the worst-case error by making an assumption about the outcome model. In this paper, we show that this outcome assumption has two immediate implications. First, we can replace the minimax optimization problem for balancing weights with a simple convex loss over the assumed outcome function class. Second, we can replace the commonly-made overlap assumption with a more appropriate quantitative measure, the minimum worst-case bias. Finally, we show conditions under which the weights remain robust when our assumptions on the outcomes are wrong.}
}
Endnote
%0 Conference Paper
%T Outcome Assumptions and Duality Theory for Balancing Weights
%A David A. Bruns-Smith
%A Avi Feller
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-bruns-smith22a
%I PMLR
%P 11037--11055
%U https://proceedings.mlr.press/v151/bruns-smith22a.html
%V 151
%X We study balancing weight estimators, which reweight outcomes from a source population to estimate missing outcomes in a target population. These estimators minimize the worst-case error by making an assumption about the outcome model. In this paper, we show that this outcome assumption has two immediate implications. First, we can replace the minimax optimization problem for balancing weights with a simple convex loss over the assumed outcome function class. Second, we can replace the commonly-made overlap assumption with a more appropriate quantitative measure, the minimum worst-case bias. Finally, we show conditions under which the weights remain robust when our assumptions on the outcomes are wrong.
APA
Bruns-Smith, D. A., & Feller, A. (2022). Outcome Assumptions and Duality Theory for Balancing Weights. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:11037-11055. Available from https://proceedings.mlr.press/v151/bruns-smith22a.html.