Flexible and Systematic Uncertainty Estimation with Conformal Prediction via the MAPIE library

Thibault Cordier, Vincent Blot, Louis Lacombe, Thomas Morzadec, Arnaud Capitaine, Nicolas Brunel
Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 204:549-581, 2023.

Abstract

Conformal prediction (CP) is an attractive theoretical framework for estimating the uncertainties of any predictive algorithm, as its methodology is general and systematic and relies on few assumptions. CP methods can be abstracted into building blocks that can be deployed on any type of data, model, or task. In this work, we contribute to the wide diffusion of the CP framework by developing the MAPIE library, which implements these principles and can seamlessly address different tasks (e.g. classification, regression, time series) in different settings (split- and cross-conformal). All these concepts are brought under a common umbrella with an emphasis on readability, transparency, and reliability, hence supporting the principles of trustworthy AI. An original feature of MAPIE is the possibility of designing tailor-made non-conformity scores, in particular p-normalized residual non-conformity scores that can be defined to account for asymmetric errors. We show theoretically the marginal coverage guarantee in several settings. We highlight through applications the interest of choosing different non-conformity scores for tabular data when considering local coverage.
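
For illustration, the minimal sketch below (not taken from the paper) shows how prediction intervals can be obtained with MAPIE around a scikit-learn regressor using the cross-conformal "plus" method. The class and argument names (MapieRegressor, method, cv, alpha) follow the 0.x API that was current when this paper was published and may differ in later releases.

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from mapie.regression import MapieRegressor

# Toy regression data.
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Cross-conformal ("plus") prediction intervals around any scikit-learn estimator.
mapie = MapieRegressor(estimator=RandomForestRegressor(random_state=0),
                       method="plus", cv=5)
mapie.fit(X_train, y_train)

# y_pred: point predictions; y_pis: lower/upper interval bounds at 90% target coverage.
y_pred, y_pis = mapie.predict(X_test, alpha=0.1)

In the MAPIE versions contemporary with the paper, a tailor-made non-conformity score of the kind discussed in the abstract can also be supplied to the estimator via its conformity_score argument.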

Cite this Paper


BibTeX
@InProceedings{pmlr-v204-cordier23a,
  title     = {Flexible and Systematic Uncertainty Estimation with Conformal Prediction via the MAPIE library},
  author    = {Cordier, Thibault and Blot, Vincent and Lacombe, Louis and Morzadec, Thomas and Capitaine, Arnaud and Brunel, Nicolas},
  booktitle = {Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications},
  pages     = {549--581},
  year      = {2023},
  editor    = {Papadopoulos, Harris and Nguyen, Khuong An and Boström, Henrik and Carlsson, Lars},
  volume    = {204},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Sep},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v204/cordier23a/cordier23a.pdf},
  url       = {https://proceedings.mlr.press/v204/cordier23a.html}
}
Endnote
%0 Conference Paper
%T Flexible and Systematic Uncertainty Estimation with Conformal Prediction via the MAPIE library
%A Thibault Cordier
%A Vincent Blot
%A Louis Lacombe
%A Thomas Morzadec
%A Arnaud Capitaine
%A Nicolas Brunel
%B Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications
%C Proceedings of Machine Learning Research
%D 2023
%E Harris Papadopoulos
%E Khuong An Nguyen
%E Henrik Boström
%E Lars Carlsson
%F pmlr-v204-cordier23a
%I PMLR
%P 549--581
%U https://proceedings.mlr.press/v204/cordier23a.html
%V 204
APA
Cordier, T., Blot, V., Lacombe, L., Morzadec, T., Capitaine, A. & Brunel, N. (2023). Flexible and Systematic Uncertainty Estimation with Conformal Prediction via the MAPIE library. Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, in Proceedings of Machine Learning Research 204:549-581. Available from https://proceedings.mlr.press/v204/cordier23a.html.