Diagnostic Uncertainty Calibration: Towards Reliable Machine Predictions in Medical Domain
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:3664-3672, 2021.
Abstract
We propose an evaluation framework for class probability estimates (CPEs) in the presence of label uncertainty, which is commonly observed in the medical domain as diagnostic disagreement between experts. We also formalize evaluation metrics for higher-order statistics, including inter-rater disagreement, to assess predictions under label uncertainty. Moreover, we propose a novel post-hoc method, alpha-calibration, that equips neural network classifiers with calibrated distributions over CPEs. Using synthetic experiments and a large-scale medical imaging application, we show that our approach significantly enhances the reliability of uncertainty estimates, namely disagreement probabilities and posterior CPEs.
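To make the flavor of such a post-hoc method concrete, the sketch below fits a single Dirichlet concentration parameter to held-out expert label counts, so that Dirichlet(alpha * p_hat) acts as a calibrated distribution over CPEs around the classifier's mean prediction p_hat. This is a minimal illustration under assumed details, not the paper's exact formulation: the scalar-alpha parameterization, the Dirichlet-multinomial objective, and the names dirichlet_multinomial_ll and fit_alpha are all hypothetical choices made here for illustration.

```python
# Hypothetical sketch: post-hoc fitting of a Dirichlet concentration scalar
# so that Dirichlet(alpha * p_hat) models the spread of expert labels around
# the classifier's mean CPE p_hat. Details are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

def dirichlet_multinomial_ll(alpha, probs, counts):
    """Dirichlet-multinomial log-likelihood of expert label counts, summed
    over items (the multinomial coefficient, constant in alpha, is dropped).

    probs  : (n_items, n_classes) mean CPEs from the classifier
    counts : (n_items, n_classes) expert votes per class
    """
    a = alpha * probs                  # per-item concentration vector
    n = counts.sum(axis=1)             # number of annotators per item
    ll = (gammaln(a.sum(axis=1)) - gammaln(a.sum(axis=1) + n)
          + (gammaln(a + counts) - gammaln(a)).sum(axis=1))
    return ll.sum()

def fit_alpha(probs, counts):
    """Maximize the held-out likelihood over the scalar alpha > 0."""
    res = minimize_scalar(
        lambda a: -dirichlet_multinomial_ll(a, probs, counts),
        bounds=(1e-3, 1e4), method="bounded")
    return res.x

# Toy usage: 3-class CPEs with votes from 5 annotators per image.
probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
counts = np.array([[4, 1, 0], [1, 3, 1]])
alpha = fit_alpha(probs, counts)
# Dirichlet(alpha * probs[i]) is then a distribution over CPEs for item i,
# from which disagreement probabilities can be estimated by sampling.
```

A small fitted alpha spreads the Dirichlet widely, signaling high expected inter-rater disagreement; a large alpha concentrates it near the mean CPE, signaling consensus.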