Distribution-Free Calibration Guarantees for Histogram Binning without Sample Splitting

Chirag Gupta, Aaditya Ramdas
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:3942-3952, 2021.

Abstract

We prove calibration guarantees for the popular histogram binning (also called uniform-mass binning) method of Zadrozny and Elkan (2001). Histogram binning has displayed strong practical performance, but theoretical guarantees have only been shown for sample-split versions that avoid 'double dipping' the data. We demonstrate that the statistical cost of sample splitting is practically significant on a credit default dataset. We then prove calibration guarantees for the original method that double dips the data, using a certain Markov property of order statistics. Based on our results, we make practical recommendations for choosing the number of bins in histogram binning. In our illustrative simulations, we propose a new tool for assessing calibration, validity plots, which provide more information than an ECE estimate.
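As a concrete illustration (this is a minimal NumPy sketch, not the authors' implementation), uniform-mass histogram binning in the double-dipping regime the paper analyzes uses the same calibration set both to place equal-mass bin edges and to estimate the per-bin label frequencies. The function name fit_histogram_binning and the tie/empty-bin handling below are our own illustrative choices.

import numpy as np

def fit_histogram_binning(scores, labels, num_bins):
    """Fit uniform-mass (histogram) binning on binary-labeled data:
    place bin edges at empirical quantiles of the scores, so each bin
    holds roughly the same number of calibration points, then map every
    bin to the empirical frequency of positive labels inside it."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=float)
    # Equal-mass edges; ties among scores can make bins slightly uneven.
    edges = np.quantile(scores, np.linspace(0.0, 1.0, num_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # cover the whole real line
    bin_ids = np.searchsorted(edges, scores, side="right") - 1
    bin_ids = np.clip(bin_ids, 0, num_bins - 1)
    # Calibrated output per bin: mean label among points falling in it
    # (0.5 is an arbitrary fallback for an empty bin, our choice).
    bin_means = np.array([labels[bin_ids == b].mean() if np.any(bin_ids == b)
                          else 0.5 for b in range(num_bins)])

    def recalibrate(new_scores):
        ids = np.searchsorted(edges, np.asarray(new_scores, dtype=float),
                              side="right") - 1
        return bin_means[np.clip(ids, 0, num_bins - 1)]

    return recalibrate

# No sample splitting: the same data defines the bins and the bin means.
rng = np.random.default_rng(0)
raw = rng.uniform(size=1000)          # uncalibrated scores
y = rng.binomial(1, raw ** 2)         # labels, miscalibrated on purpose
calibrated = fit_histogram_binning(raw, y, num_bins=10)
print(calibrated([0.1, 0.5, 0.9]))    # roughly tracks the true prob s^2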

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-gupta21b,
  title     = {Distribution-Free Calibration Guarantees for Histogram Binning without Sample Splitting},
  author    = {Gupta, Chirag and Ramdas, Aaditya},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {3942--3952},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/gupta21b/gupta21b.pdf},
  url       = {https://proceedings.mlr.press/v139/gupta21b.html}
}
Endnote
%0 Conference Paper
%T Distribution-Free Calibration Guarantees for Histogram Binning without Sample Splitting
%A Chirag Gupta
%A Aaditya Ramdas
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-gupta21b
%I PMLR
%P 3942--3952
%U https://proceedings.mlr.press/v139/gupta21b.html
%V 139
APA
Gupta, C. & Ramdas, A. (2021). Distribution-Free Calibration Guarantees for Histogram Binning without Sample Splitting. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:3942-3952. Available from https://proceedings.mlr.press/v139/gupta21b.html.
