Flexible and Systematic Uncertainty Estimation with Conformal Prediction via the MAPIE library
Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 204:549-581, 2023.
Abstract
Conformal prediction (CP) is an attractive theoretical framework for estimating the uncertainty of any predictive algorithm, as its methodology is general and systematic and relies on few assumptions. CP methods can be abstracted into building blocks that can be deployed on any type of data, model, or task. In this work, we contribute to the wide diffusion of the CP framework by developing the MAPIE library, which implements these principles and can seamlessly address different tasks (e.g., classification, regression, time series) in different settings (split- and cross-conformal). All these concepts are brought under a common umbrella with an emphasis on readability, transparency, and reliability, hence supporting the principles of trustworthy AI. An original feature of MAPIE is the possibility of designing tailor-made non-conformity scores, in particular p-normalized residual non-conformity scores, which can be defined to account for asymmetric errors. We prove the marginal coverage guarantee theoretically in several settings. Through applications, we highlight the benefit of choosing different non-conformity scores for tabular data when local coverage is of interest.
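
As a concrete illustration of the split-conformal setting mentioned in the abstract, the sketch below builds prediction intervals with MAPIE's MapieRegressor. It is a minimal sketch, assuming the MAPIE regression API available around the time of the paper (e.g., v0.6); the toy data, the LinearRegression base model, and the alpha level are illustrative choices, not part of the paper.

```python
# Minimal sketch: split-conformal regression intervals with MAPIE.
# Assumes MapieRegressor(cv="prefit") and predict(X, alpha=...) as in
# MAPIE ~v0.6; data and base model are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from mapie.regression import MapieRegressor

# Toy heteroscedastic regression data (illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(1000, 1))
y = X.ravel() + rng.normal(0, 0.5 + 0.1 * X.ravel())

# Split into proper training, calibration, and test sets.
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.5, random_state=0
)
X_calib, X_test, y_calib, y_test = train_test_split(
    X_rest, y_rest, test_size=0.5, random_state=0
)

# Split-conformal setting: fit the model once, then calibrate it
# on held-out data via cv="prefit". A cross-conformal run would
# instead use, e.g., MapieRegressor(LinearRegression(), cv=5).
model = LinearRegression().fit(X_train, y_train)
mapie = MapieRegressor(model, cv="prefit")
mapie.fit(X_calib, y_calib)

# alpha=0.1 targets 90% marginal coverage; y_pis holds the
# lower/upper interval bounds for each test point.
y_pred, y_pis = mapie.predict(X_test, alpha=0.1)
coverage = np.mean((y_pis[:, 0, 0] <= y_test) & (y_test <= y_pis[:, 1, 0]))
print(f"empirical coverage: {coverage:.3f}")
```

The same estimator object accepts alternative non-conformity scores (the paper's original feature) through its conformity-score argument, which is how the asymmetric, p-normalized residual scores discussed above would be swapped in.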