Aleatoric and Epistemic Uncertainty in Conformal Prediction

Yusuf Sale, Alireza Javanmardi, Eyke Hüllermeier
Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 266:784-786, 2025.

Abstract

Recently, there has been particular interest in distinguishing different types of uncertainty in supervised machine learning (ML) settings (Hüllermeier and Waegeman, 2021). Aleatoric uncertainty captures the inherent randomness in the data-generating process. As it represents variability that cannot be reduced even with more data, it is often referred to as irreducible uncertainty. In contrast, epistemic uncertainty arises from a lack of knowledge about the underlying data-generating process and can, in principle, be reduced by acquiring additional data or improving the model itself (viz. reducible uncertainty). In parallel, interest in conformal prediction (CP), both in its theory and in its applications, has grown just as rapidly. Conformal prediction (Vovk et al., 2005) is a model-agnostic framework for uncertainty quantification that provides prediction sets or intervals with rigorous statistical coverage guarantees. Notably, CP is distribution-free and makes only the mild assumption of exchangeability. Under this assumption, it yields prediction sets or intervals that contain the true outcome with a user-specified probability. Thus, CP is seen as a promising tool for quantifying uncertainty. But how is it related to aleatoric and epistemic uncertainty? In particular, we first analyze how (estimates of) aleatoric and epistemic uncertainty enter into the construction of vanilla CP, that is, how noise and model error jointly shape the global threshold. We then review "uncertainty-aware" extensions that integrate these uncertainty estimates into the CP pipeline.
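
To make the role of the global threshold concrete, the following is a minimal sketch of vanilla split conformal prediction for regression in Python (not taken from the paper; the synthetic data, the simple linear model, and all variable names are illustrative). Both the noise in the data (aleatoric uncertainty) and the error of the fitted model (epistemic uncertainty) inflate the calibration scores and hence the single threshold q_hat.

# Minimal sketch of vanilla (split) conformal prediction for regression,
# assuming absolute residuals as the nonconformity score.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = x + heteroscedastic noise (aleatoric uncertainty).
n = 2000
x = rng.uniform(-3, 3, size=n)
y = x + rng.normal(scale=0.5 + 0.5 * np.abs(x), size=n)

# Split into a proper training set and a calibration set.
x_train, y_train = x[:1000], y[:1000]
x_cal, y_cal = x[1000:], y[1000:]

# Fit any point predictor; a least-squares line stands in for the model here.
# Epistemic uncertainty (model error) stems from this fit being imperfect.
slope, intercept = np.polyfit(x_train, y_train, deg=1)
predict = lambda x_: slope * x_ + intercept

# Nonconformity scores on the calibration set: |y - y_hat|.
scores = np.abs(y_cal - predict(x_cal))

# Global threshold: the (1 - alpha) empirical quantile of the scores,
# with the usual finite-sample correction.
alpha = 0.1
n_cal = len(scores)
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q_hat = np.quantile(scores, q_level, method="higher")

# Prediction interval for a new input: [y_hat - q_hat, y_hat + q_hat].
x_new = 1.5
y_hat = predict(x_new)
print(f"interval: [{y_hat - q_hat:.2f}, {y_hat + q_hat:.2f}]")

Because q_hat is a single global quantity, the resulting intervals have the same width everywhere: regions with little noise and regions with high noise or sparse training data are treated alike, which is exactly the limitation that the "uncertainty-aware" extensions discussed in the paper address.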

Cite this Paper


BibTeX
@InProceedings{pmlr-v266-sale25a,
  title     = {Aleatoric and Epistemic Uncertainty in Conformal Prediction},
  author    = {Sale, Yusuf and Javanmardi, Alireza and H\"{u}llermeier, Eyke},
  booktitle = {Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications},
  pages     = {784--786},
  year      = {2025},
  editor    = {Nguyen, Khuong An and Luo, Zhiyuan and Papadopoulos, Harris and Löfström, Tuwe and Carlsson, Lars and Boström, Henrik},
  volume    = {266},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--12 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v266/main/assets/sale25a/sale25a.pdf},
  url       = {https://proceedings.mlr.press/v266/sale25a.html}
}
Endnote
%0 Conference Paper
%T Aleatoric and Epistemic Uncertainty in Conformal Prediction
%A Yusuf Sale
%A Alireza Javanmardi
%A Eyke Hüllermeier
%B Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications
%C Proceedings of Machine Learning Research
%D 2025
%E Khuong An Nguyen
%E Zhiyuan Luo
%E Harris Papadopoulos
%E Tuwe Löfström
%E Lars Carlsson
%E Henrik Boström
%F pmlr-v266-sale25a
%I PMLR
%P 784--786
%U https://proceedings.mlr.press/v266/sale25a.html
%V 266
APA
Sale, Y., Javanmardi, A. & Hüllermeier, E. (2025). Aleatoric and Epistemic Uncertainty in Conformal Prediction. Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, in Proceedings of Machine Learning Research 266:784-786. Available from https://proceedings.mlr.press/v266/sale25a.html.