Approximation to object conditional validity with inductive conformal predictors

Anthony Bellotti
Proceedings of the Tenth Symposium on Conformal and Probabilistic Prediction and Applications, PMLR 152:4-23, 2021.

Abstract

Conformal predictors are machine learning algorithms that output prediction sets that have a guarantee of marginal validity for finite samples with minimal distributional assumptions. This is a property that makes conformal predictors useful for machine learning tasks where we require reliable predictions. It would also be desirable to achieve conditional validity in the same setting, in the sense that validity of the prediction intervals remains true regardless of conditioning on any particular property of the object of the prediction. Unfortunately, it has been shown that such conditional validity is impossible to guarantee for non-trivial prediction problems for finite samples. In this article, instead of trying to achieve a strong conditional validity guarantee, an \emph{approximation} to conditional validity is considered and measured empirically. A new algorithm is introduced to do this by iteratively adjusting a conformity measure to deviations from object conditional validity measured in the training data. Experimental results are provided for three data sets that demonstrate (1) in real world machine learning tasks, lack of conditional validity is a measurable problem and (2) that the proposed algorithm is effective at alleviating this problem.
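The marginal-validity guarantee the abstract describes can be illustrated with a minimal split (inductive) conformal regressor. This is a generic textbook-style sketch, not the paper's proposed iterative algorithm: `fit_predict` is a placeholder for any training routine, and absolute residuals are used as the nonconformity score.

```python
import numpy as np

def icp_interval(X_train, y_train, X_cal, y_cal, x_new, fit_predict, alpha=0.1):
    """Inductive conformal predictor for regression.

    Trains on the proper training set, computes absolute-residual
    nonconformity scores on the calibration set, and returns a
    prediction interval for x_new with marginal coverage >= 1 - alpha
    (assuming exchangeable data).
    """
    predict = fit_predict(X_train, y_train)       # fitted model -> prediction fn
    scores = np.abs(y_cal - predict(X_cal))       # nonconformity scores
    n = len(scores)
    # finite-sample calibration quantile: ceil((n+1)(1-alpha))-th order statistic
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    q = np.sort(scores)[k - 1]
    mu = predict(np.atleast_2d(x_new))[0]
    return mu - q, mu + q
```

The guarantee here is only marginal: coverage holds on average over all test objects, which is exactly why, as the abstract notes, coverage can still fail conditionally on particular regions of the object space.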

Cite this Paper


BibTeX
@InProceedings{pmlr-v152-bellotti21a,
  title     = {Approximation to object conditional validity with inductive conformal predictors},
  author    = {Bellotti, Anthony},
  booktitle = {Proceedings of the Tenth Symposium on Conformal and Probabilistic Prediction and Applications},
  pages     = {4--23},
  year      = {2021},
  editor    = {Carlsson, Lars and Luo, Zhiyuan and Cherubin, Giovanni and An Nguyen, Khuong},
  volume    = {152},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--10 Sep},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v152/bellotti21a/bellotti21a.pdf},
  url       = {https://proceedings.mlr.press/v152/bellotti21a.html},
  abstract  = {Conformal predictors are machine learning algorithms that output prediction sets that have a guarantee of marginal validity for finite samples with minimal distributional assumptions. This is a property that makes conformal predictors useful for machine learning tasks where we require reliable predictions. It would also be desirable to achieve conditional validity in the same setting, in the sense that validity of the prediction intervals remains true regardless of conditioning on any particular property of the object of the prediction. Unfortunately, it has been shown that such conditional validity is impossible to guarantee for non-trivial prediction problems for finite samples. In this article, instead of trying to achieve a strong conditional validity guarantee, an \emph{approximation} to conditional validity is considered and measured empirically. A new algorithm is introduced to do this by iteratively adjusting a conformity measure to deviations from object conditional validity measured in the training data. Experimental results are provided for three data sets that demonstrate (1) in real world machine learning tasks, lack of conditional validity is a measurable problem and (2) that the proposed algorithm is effective at alleviating this problem.}
}
Endnote
%0 Conference Paper
%T Approximation to object conditional validity with inductive conformal predictors
%A Anthony Bellotti
%B Proceedings of the Tenth Symposium on Conformal and Probabilistic Prediction and Applications
%C Proceedings of Machine Learning Research
%D 2021
%E Lars Carlsson
%E Zhiyuan Luo
%E Giovanni Cherubin
%E Khuong An Nguyen
%F pmlr-v152-bellotti21a
%I PMLR
%P 4--23
%U https://proceedings.mlr.press/v152/bellotti21a.html
%V 152
%X Conformal predictors are machine learning algorithms that output prediction sets that have a guarantee of marginal validity for finite samples with minimal distributional assumptions. This is a property that makes conformal predictors useful for machine learning tasks where we require reliable predictions. It would also be desirable to achieve conditional validity in the same setting, in the sense that validity of the prediction intervals remains true regardless of conditioning on any particular property of the object of the prediction. Unfortunately, it has been shown that such conditional validity is impossible to guarantee for non-trivial prediction problems for finite samples. In this article, instead of trying to achieve a strong conditional validity guarantee, an \emph{approximation} to conditional validity is considered and measured empirically. A new algorithm is introduced to do this by iteratively adjusting a conformity measure to deviations from object conditional validity measured in the training data. Experimental results are provided for three data sets that demonstrate (1) in real world machine learning tasks, lack of conditional validity is a measurable problem and (2) that the proposed algorithm is effective at alleviating this problem.
APA
Bellotti, A. (2021). Approximation to object conditional validity with inductive conformal predictors. Proceedings of the Tenth Symposium on Conformal and Probabilistic Prediction and Applications, in Proceedings of Machine Learning Research 152:4-23. Available from https://proceedings.mlr.press/v152/bellotti21a.html.