Multi-class Classification with Reject Option and Performance Guarantees using Conformal Prediction
Proceedings of the Thirteenth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 230:295-314, 2024.
Abstract
Beyond the standard classification scenario, allowing a classifier to refrain from making a prediction under uncertainty can be advantageous in safety-critical applications, where a mistake may incur high costs. In this paper, we extend previous work on the development of classifiers with a reject option grounded in the conformal prediction framework. Specifically, we introduce a novel approach for inducing multi-class classifiers with reliable accuracy or recall estimates for a given rejection rate. We empirically evaluate the proposed approach on six multi-class datasets and demonstrate its effectiveness against both calibrated and uncalibrated probabilistic classifiers. The results underscore our method’s capability to provide reliable error rate estimates, thereby enhancing decision-making processes where erroneous predictions bear critical consequences.
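To make the general idea concrete, below is a minimal sketch of a split-conformal classifier with a reject option: abstain whenever the conformal prediction set for a test point is not a single label. This is an illustration of the generic framework the abstract builds on, not the paper's specific method; the nonconformity score (1 minus the predicted probability of the true class), the miscoverage level `alpha`, and the dataset choice are all assumptions made for the example.

```python
# Illustrative sketch only: reject when the conformal prediction set is
# not a singleton. Not the paper's exact procedure; score function and
# alpha are assumptions chosen for demonstration.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

# Nonconformity score on the calibration set: 1 - p(true class).
cal_probs = model.predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

alpha = 0.1  # target miscoverage level (assumed for the example)
n = len(cal_scores)
q = np.quantile(cal_scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction set for each test point: all labels whose score is below q.
test_probs = model.predict_proba(X_test)
pred_sets = (1.0 - test_probs) <= q

# Reject when the set is not a singleton; otherwise predict its label.
singleton = pred_sets.sum(axis=1) == 1
accepted_preds = pred_sets[singleton].argmax(axis=1)
rejection_rate = 1.0 - singleton.mean()
accuracy_on_accepted = (accepted_preds == y_test[singleton]).mean()
print(f"rejection rate: {rejection_rate:.2%}, "
      f"accuracy on accepted predictions: {accuracy_on_accepted:.2%}")
```

In this simple scheme the rejection rate is a by-product of the chosen miscoverage level, whereas the paper's contribution is the reverse direction: fixing a rejection rate and obtaining reliable accuracy or recall estimates for the accepted predictions.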