Calibrated Prediction with Covariate Shift via Unsupervised Domain Adaptation
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3219-3229, 2020.
Abstract
Reliable uncertainty estimates are an important tool for helping autonomous agents or human decision makers understand and leverage predictive models. However, existing approaches to estimating uncertainty largely ignore the possibility of covariate shift, i.e., where the real-world data distribution may differ from the training distribution. As a consequence, existing algorithms can overestimate certainty, possibly yielding a false sense of confidence in the predictive model. We propose an algorithm for calibrating predictions that accounts for the possibility of covariate shift, given labeled examples from the training distribution and unlabeled examples from the real-world distribution. Our algorithm uses importance weighting to correct for the shift from the training to the real-world distribution. However, importance weighting relies on the training and real-world distributions being sufficiently close. Building on ideas from domain adaptation, we additionally learn a feature map that tries to equalize these two distributions. In an empirical evaluation, we show that our proposed approach outperforms existing approaches to calibrated prediction when there is covariate shift.
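To make the importance-weighting idea concrete, the following is a minimal sketch, not the authors' implementation: a domain classifier trained to distinguish labeled source examples from unlabeled target examples yields importance weights w(x) = p_target(x)/p_source(x), which then reweight a calibration objective fit on labeled source data. The function names, the logistic-regression weight estimator, and the choice of temperature scaling as the calibration map are all illustrative assumptions; the paper's full method additionally learns a feature map that brings the two distributions closer before weighting.

```python
# Sketch: importance-weighted calibration under covariate shift.
# Assumes NumPy, SciPy, and scikit-learn; all names here are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from scipy.optimize import minimize_scalar

def estimate_importance_weights(X_source, X_target):
    """Estimate w(x) = p_target(x) / p_source(x) with a domain classifier.
    If d(x) = P(target | x), then w(x) is proportional to d(x) / (1 - d(x))."""
    X = np.vstack([X_source, X_target])
    y = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    d = clf.predict_proba(X_source)[:, 1]
    w = d / np.clip(1.0 - d, 1e-6, None)
    return w / w.mean()  # normalize for numerical stability

def weighted_temperature(logits, labels, weights):
    """Fit a temperature T by minimizing the importance-weighted negative
    log-likelihood on labeled source data, so the calibration map targets
    the shifted (real-world) distribution rather than the training one."""
    def nll(T):
        z = logits / T
        z = z - z.max(axis=1, keepdims=True)  # stabilize the softmax
        log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
        return -(weights * log_probs[np.arange(len(labels)), labels]).mean()
    return minimize_scalar(nll, bounds=(0.05, 20.0), method="bounded").x
```

In the paper's setting, the weights would be estimated in the learned feature space rather than on raw inputs, since the feature map is what makes the two distributions close enough for importance weighting to be reliable.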