Calibrated Prediction with Covariate Shift via Unsupervised Domain Adaptation

Sangdon Park, Osbert Bastani, James Weimer, Insup Lee
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3219-3229, 2020.

Abstract

Reliable uncertainty estimates are an important tool for helping autonomous agents or human decision makers understand and leverage predictive models. However, existing approaches to estimating uncertainty largely ignore the possibility of covariate shift—i.e., where the real-world data distribution may differ from the training distribution. As a consequence, existing algorithms can overestimate certainty, possibly yielding a false sense of confidence in the predictive model. We propose an algorithm for calibrating predictions that accounts for the possibility of covariate shift, given labeled examples from the training distribution and unlabeled examples from the real-world distribution. Our algorithm uses importance weighting to correct for the shift from the training to the real-world distribution. However, importance weighting relies on the training and real-world distributions being sufficiently close. Building on ideas from domain adaptation, we additionally learn a feature map that tries to equalize these two distributions. In an empirical evaluation, we show that our proposed approach outperforms existing approaches to calibrated prediction when there is covariate shift.
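The importance-weighting idea described above can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's exact method: it assumes importance weights w(x) = p_target(x) / p_source(x) are obtained from a source-vs-target domain classifier, and then fits a temperature-scaling parameter by minimizing the importance-weighted negative log-likelihood on labeled source data. All function names here (`importance_weights`, `weighted_temperature`) are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def importance_weights(domain_probs, eps=1e-6):
    """Turn a domain classifier's P(target | x) into w(x) = p_t(x) / p_s(x).

    If the classifier is trained on an equal mix of source and target
    examples, the density ratio is p / (1 - p).
    """
    p = np.clip(domain_probs, eps, 1.0 - eps)
    return p / (1.0 - p)

def weighted_temperature(logits, labels, weights):
    """Fit a scalar temperature T minimizing the importance-weighted NLL."""
    def nll(t):
        z = logits / t
        z = z - z.max(axis=1, keepdims=True)  # stabilize the softmax
        logp = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
        return -(weights * logp[np.arange(len(labels)), labels]).mean()
    res = minimize_scalar(nll, bounds=(0.05, 20.0), method="bounded")
    return res.x
```

When the two distributions coincide, the domain classifier outputs 0.5 everywhere, all weights are 1, and the procedure reduces to ordinary temperature scaling; under shift, source examples that look like target data dominate the calibration objective.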

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-park20b,
  title     = {Calibrated Prediction with Covariate Shift via Unsupervised Domain Adaptation},
  author    = {Park, Sangdon and Bastani, Osbert and Weimer, James and Lee, Insup},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {3219--3229},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/park20b/park20b.pdf},
  url       = {http://proceedings.mlr.press/v108/park20b.html},
  abstract  = {Reliable uncertainty estimates are an important tool for helping autonomous agents or human decision makers understand and leverage predictive models. However, existing approaches to estimating uncertainty largely ignore the possibility of covariate shift—i.e., where the real-world data distribution may differ from the training distribution. As a consequence, existing algorithms can overestimate certainty, possibly yielding a false sense of confidence in the predictive model. We propose an algorithm for calibrating predictions that accounts for the possibility of covariate shift, given labeled examples from the training distribution and unlabeled examples from the real-world distribution. Our algorithm uses importance weighting to correct for the shift from the training to the real-world distribution. However, importance weighting relies on the training and real-world distributions being sufficiently close. Building on ideas from domain adaptation, we additionally learn a feature map that tries to equalize these two distributions. In an empirical evaluation, we show that our proposed approach outperforms existing approaches to calibrated prediction when there is covariate shift.}
}
Endnote
%0 Conference Paper %T Calibrated Prediction with Covariate Shift via Unsupervised Domain Adaptation %A Sangdon Park %A Osbert Bastani %A James Weimer %A Insup Lee %B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2020 %E Silvia Chiappa %E Roberto Calandra %F pmlr-v108-park20b %I PMLR %P 3219--3229 %U http://proceedings.mlr.press/v108/park20b.html %V 108 %X Reliable uncertainty estimates are an important tool for helping autonomous agents or human decision makers understand and leverage predictive models. However, existing approaches to estimating uncertainty largely ignore the possibility of covariate shift—i.e., where the real-world data distribution may differ from the training distribution. As a consequence, existing algorithms can overestimate certainty, possibly yielding a false sense of confidence in the predictive model. We propose an algorithm for calibrating predictions that accounts for the possibility of covariate shift, given labeled examples from the training distribution and unlabeled examples from the real-world distribution. Our algorithm uses importance weighting to correct for the shift from the training to the real-world distribution. However, importance weighting relies on the training and real-world distributions being sufficiently close. Building on ideas from domain adaptation, we additionally learn a feature map that tries to equalize these two distributions. In an empirical evaluation, we show that our proposed approach outperforms existing approaches to calibrated prediction when there is covariate shift.
APA
Park, S., Bastani, O., Weimer, J. &amp; Lee, I. (2020). Calibrated Prediction with Covariate Shift via Unsupervised Domain Adaptation. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:3219-3229. Available from http://proceedings.mlr.press/v108/park20b.html.