Confidence Classifiers with Guaranteed Accuracy or Precision

Ulf Johansson, Cecilia Sönströd, Tuwe Löfström, Henrik Boström
Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 204:513-533, 2023.

Abstract

In many situations, probabilistic predictors have replaced conformal classifiers. The main reason is arguably that the set predictions of conformal classifiers, with the accompanying significance level, are hard to interpret. In this paper, we demonstrate how conformal classification can be used as the basis for a classifier with a reject option. Specifically, we introduce and evaluate two algorithms that are able to perfectly estimate accuracy or precision for a set of test instances in a classifier-with-reject scenario. In the empirical investigation, the suggested algorithms are shown to clearly outperform both calibrated and uncalibrated probabilistic predictors.
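As an illustration of the general idea described in the abstract (not the paper's two specific algorithms), a conformal classifier can drive a reject option by predicting only when the conformal prediction set at significance level ε is a singleton, and abstaining otherwise. The sketch below assumes a toy two-class problem and a simple hand-written probability model; all names and data are illustrative.

```python
# Minimal sketch, assuming a toy setup: an inductive conformal classifier
# used as a classifier with a reject option. Predict only when the
# prediction set at significance level eps is a singleton; reject otherwise.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D, two-class data: class 0 ~ N(-1, 1), class 1 ~ N(+1, 1).
def sample(n):
    y = rng.integers(0, 2, n)
    x = rng.normal(2.0 * y - 1.0, 1.0, n)
    return x, y

x_cal, y_cal = sample(500)     # calibration set
x_test, y_test = sample(200)   # test set

def prob_class1(x):
    # Assumed underlying model: a fixed logistic score in x.
    return 1.0 / (1.0 + np.exp(-2.0 * x))

def nonconformity(x, y):
    # Standard score: 1 - estimated probability of the (candidate) class.
    p1 = prob_class1(x)
    return np.where(y == 1, 1.0 - p1, p1)

alpha_cal = nonconformity(x_cal, y_cal)

def p_value(x, label):
    # Conformal p-value: fraction of calibration scores at least as
    # nonconforming as the test score (with the +1 correction).
    a = nonconformity(np.asarray([x]), np.asarray([label]))[0]
    return (np.sum(alpha_cal >= a) + 1) / (len(alpha_cal) + 1)

eps = 0.1  # significance level
accepted, correct = 0, 0
for x, y in zip(x_test, y_test):
    pred_set = [c for c in (0, 1) if p_value(x, c) > eps]
    if len(pred_set) == 1:          # singleton set -> predict; else reject
        accepted += 1
        correct += pred_set[0] == y

print(f"accuracy on accepted instances: {correct}/{accepted}")
```

Because rejected instances are exactly those with ambiguous (non-singleton) prediction sets, accuracy on the accepted instances is typically well above the accuracy of predicting on everything; the paper's contribution, per the abstract, is algorithms that estimate that accuracy (or precision) exactly.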

Cite this Paper


BibTeX
@InProceedings{pmlr-v204-johansson23a,
  title     = {Confidence Classifiers with Guaranteed Accuracy or Precision},
  author    = {Johansson, Ulf and Sönströd, Cecilia and Löfström, Tuwe and Boström, Henrik},
  booktitle = {Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications},
  pages     = {513--533},
  year      = {2023},
  editor    = {Papadopoulos, Harris and Nguyen, Khuong An and Boström, Henrik and Carlsson, Lars},
  volume    = {204},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Sep},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v204/johansson23a/johansson23a.pdf},
  url       = {https://proceedings.mlr.press/v204/johansson23a.html},
  abstract  = {In many situations, probabilistic predictors have replaced conformal classifiers. The main reason is arguably that the set predictions of conformal classifiers, with the accompanying significance level, are hard to interpret. In this paper, we demonstrate how conformal classification can be used as a basis for a classifier with reject option. Specifically, we introduce and evaluate two algorithms that are able to perfectly estimate accuracy or precision for a set of test instances, in a classifier with reject scenario. In the empirical investigation, the suggested algorithms are shown to clearly outperform both calibrated and uncalibrated probabilistic predictors.}
}
Endnote
%0 Conference Paper
%T Confidence Classifiers with Guaranteed Accuracy or Precision
%A Ulf Johansson
%A Cecilia Sönströd
%A Tuwe Löfström
%A Henrik Boström
%B Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications
%C Proceedings of Machine Learning Research
%D 2023
%E Harris Papadopoulos
%E Khuong An Nguyen
%E Henrik Boström
%E Lars Carlsson
%F pmlr-v204-johansson23a
%I PMLR
%P 513--533
%U https://proceedings.mlr.press/v204/johansson23a.html
%V 204
%X In many situations, probabilistic predictors have replaced conformal classifiers. The main reason is arguably that the set predictions of conformal classifiers, with the accompanying significance level, are hard to interpret. In this paper, we demonstrate how conformal classification can be used as a basis for a classifier with reject option. Specifically, we introduce and evaluate two algorithms that are able to perfectly estimate accuracy or precision for a set of test instances, in a classifier with reject scenario. In the empirical investigation, the suggested algorithms are shown to clearly outperform both calibrated and uncalibrated probabilistic predictors.
APA
Johansson, U., Sönströd, C., Löfström, T. &amp; Boström, H. (2023). Confidence Classifiers with Guaranteed Accuracy or Precision. Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, in Proceedings of Machine Learning Research 204:513-533. Available from https://proceedings.mlr.press/v204/johansson23a.html.