Conformal Prediction of Classifiers with Many Classes based on Noisy Labels

Coby Penso, Jacob Goldberger, Ethan Fetaya
Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 266:82-95, 2025.

Abstract

Conformal Prediction (CP) is a method to control prediction uncertainty by producing a small prediction set, ensuring a predetermined probability that the true class lies within this set. This is commonly done by defining a score, based on the model predictions, and setting a threshold on this score using a validation set. In this study, we address the problem of CP calibration when we only have access to a validation set with noisy labels. We show how we can estimate the noise-free conformal threshold based on the noisy labeled data. We derive a finite-sample coverage guarantee under uniform noise that remains effective even in classification tasks with a large number of classes. We dub our approach Noise-Aware Conformal Prediction (NACP). We illustrate the performance of the proposed results on several standard image classification datasets with a large number of classes.
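The calibration procedure the abstract describes (scoring model predictions on a validation set, then thresholding at a finite-sample-corrected quantile) can be sketched as follows. This is a minimal illustration of standard split conformal prediction on simulated data, not the paper's noise-aware NACP method; the score choice (one minus the labeled class's softmax probability) and all array names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_val, n_classes, alpha = 500, 10, 0.1  # target coverage 1 - alpha = 90%

# Simulated softmax outputs for a validation set; in practice these come
# from the trained classifier, and the labels from (possibly noisy) annotation.
probs = rng.dirichlet(np.ones(n_classes) * 0.3, size=n_val)
labels = probs.argmax(axis=1)  # toy stand-in for validation labels

# Conformity score: 1 - predicted probability of the labeled class
# (lower score = model more confident in that label).
scores = 1.0 - probs[np.arange(n_val), labels]

# Finite-sample-corrected quantile level yields the conformal threshold.
q_level = np.ceil((n_val + 1) * (1 - alpha)) / n_val
threshold = np.quantile(scores, q_level, method="higher")

# Prediction set for a new example: every class whose score is below the threshold.
new_probs = rng.dirichlet(np.ones(n_classes) * 0.3)
pred_set = np.where(1.0 - new_probs <= threshold)[0]
```

The paper's contribution concerns the step where `scores` are computed from noisy labels: NACP estimates the noise-free threshold from noisy-labeled data rather than thresholding the noisy scores directly.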

Cite this Paper


BibTeX
@InProceedings{pmlr-v266-penso25a,
  title     = {Conformal Prediction of Classifiers with Many Classes based on Noisy Labels},
  author    = {Penso, Coby and Goldberger, Jacob and Fetaya, Ethan},
  booktitle = {Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications},
  pages     = {82--95},
  year      = {2025},
  editor    = {Nguyen, Khuong An and Luo, Zhiyuan and Papadopoulos, Harris and Löfström, Tuwe and Carlsson, Lars and Boström, Henrik},
  volume    = {266},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--12 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v266/main/assets/penso25a/penso25a.pdf},
  url       = {https://proceedings.mlr.press/v266/penso25a.html},
  abstract  = {Conformal Prediction (CP) is a method to control prediction uncertainty by producing a small prediction set, ensuring a predetermined probability that the true class lies within this set. This is commonly done by defining a score, based on the model predictions, and setting a threshold on this score using a validation set. In this study, we address the problem of CP calibration when we only have access to a validation set with noisy labels. We show how we can estimate the noise-free conformal threshold based on the noisy labeled data. We derive a finite-sample coverage guarantee under uniform noise that remains effective even in classification tasks with a large number of classes. We dub our approach Noise-Aware Conformal Prediction (NACP). We illustrate the performance of the proposed results on several standard image classification datasets with a large number of classes.}
}
Endnote
%0 Conference Paper
%T Conformal Prediction of Classifiers with Many Classes based on Noisy Labels
%A Coby Penso
%A Jacob Goldberger
%A Ethan Fetaya
%B Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications
%C Proceedings of Machine Learning Research
%D 2025
%E Khuong An Nguyen
%E Zhiyuan Luo
%E Harris Papadopoulos
%E Tuwe Löfström
%E Lars Carlsson
%E Henrik Boström
%F pmlr-v266-penso25a
%I PMLR
%P 82--95
%U https://proceedings.mlr.press/v266/penso25a.html
%V 266
%X Conformal Prediction (CP) is a method to control prediction uncertainty by producing a small prediction set, ensuring a predetermined probability that the true class lies within this set. This is commonly done by defining a score, based on the model predictions, and setting a threshold on this score using a validation set. In this study, we address the problem of CP calibration when we only have access to a validation set with noisy labels. We show how we can estimate the noise-free conformal threshold based on the noisy labeled data. We derive a finite-sample coverage guarantee under uniform noise that remains effective even in classification tasks with a large number of classes. We dub our approach Noise-Aware Conformal Prediction (NACP). We illustrate the performance of the proposed results on several standard image classification datasets with a large number of classes.
APA
Penso, C., Goldberger, J. &amp; Fetaya, E. (2025). Conformal Prediction of Classifiers with Many Classes based on Noisy Labels. Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, in Proceedings of Machine Learning Research 266:82-95. Available from https://proceedings.mlr.press/v266/penso25a.html.