Training conformal predictors
Proceedings of the Ninth Symposium on Conformal and Probabilistic Prediction and Applications, PMLR 128:55-64, 2020.
Abstract
Efficiency criteria for conformal prediction, such as observed fuzziness (i.e., the sum of p-values associated with false labels), are commonly used to evaluate the performance of given conformal predictors. Here, we investigate whether it is possible to exploit efficiency criteria to learn classifiers, both conformal predictors and point classifiers, by using such criteria as training objective functions. The proposed idea is implemented for the problem of binary classification of hand-written digits. By choosing a 1-dimensional model class (with one real-valued free parameter), we can solve the optimization problems through an (approximate) exhaustive search over (a discrete version of) the parameter space. Our empirical results suggest that conformal predictors trained by minimizing their observed fuzziness perform better than conformal predictors trained in the traditional way by minimizing the prediction error of the corresponding point classifier. They also have reasonable performance in terms of their prediction error on the test set.
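To make the training criterion concrete, the following is a minimal Python sketch of what minimizing observed fuzziness over a one-parameter model class by exhaustive grid search could look like. The nonconformity score, the synthetic one-dimensional data, and all function names are illustrative assumptions, not the paper's actual model or digit data.

```python
import numpy as np

rng = np.random.default_rng(0)

def smoothed_p_value(cal_scores, test_score, rng):
    """Smoothed conformal p-value of a candidate score against calibration scores."""
    n = len(cal_scores) + 1
    greater = np.sum(cal_scores > test_score)
    equal = np.sum(cal_scores == test_score) + 1  # the candidate example itself
    return (greater + rng.uniform() * equal) / n

def observed_fuzziness(p_vals, y_true):
    """Average sum of p-values assigned to the false labels (binary case)."""
    return p_vals[np.arange(len(y_true)), 1 - y_true].mean()

def nonconformity(x, y, theta):
    """Hypothetical one-parameter score: signed distance of feature x from
    threshold theta, with the sign flipped according to the candidate label y."""
    return np.where(y == 1, theta - x, x - theta)

# Synthetic stand-in for the binary classification data: one feature per example.
x_cal = np.concatenate([rng.normal(-1, 1, 200), rng.normal(1, 1, 200)])
y_cal = np.concatenate([np.zeros(200, int), np.ones(200, int)])
x_val = np.concatenate([rng.normal(-1, 1, 100), rng.normal(1, 1, 100)])
y_val = np.concatenate([np.zeros(100, int), np.ones(100, int)])

def fuzziness_of(theta):
    """Observed fuzziness of the split conformal predictor with parameter theta."""
    cal_scores = nonconformity(x_cal, y_cal, theta)
    p = np.empty((len(x_val), 2))
    for i, x in enumerate(x_val):
        for label in (0, 1):
            p[i, label] = smoothed_p_value(cal_scores,
                                           nonconformity(x, label, theta), rng)
    return observed_fuzziness(p, y_val)

# (Approximate) exhaustive search over a discretised version of the parameter space.
grid = np.linspace(-3, 3, 121)
best_theta = min(grid, key=fuzziness_of)
print("theta minimising observed fuzziness:", best_theta)
```

Because the p-values are smoothed, the objective is mildly randomized; in this sketch that simply adds a small amount of noise to the grid search, which a discretised exhaustive search tolerates well.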