Rectifying Conformity Scores for Better Conditional Coverage

Vincent Plassier, Alexander Fishkov, Victor Dheur, Mohsen Guizani, Souhaib Ben Taieb, Maxim Panov, Eric Moulines
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:49459-49492, 2025.

Abstract

We present a new method for generating confidence sets within the split conformal prediction framework. Our method performs a trainable transformation of any given conformity score to improve conditional coverage while ensuring exact marginal coverage. The transformation is based on an estimate of the conditional quantile of conformity scores. The resulting method is particularly beneficial for constructing adaptive confidence sets in multi-output problems where standard conformal quantile regression approaches have limited applicability. We develop a theoretical bound that captures the influence of the accuracy of the quantile estimate on the approximate conditional validity, unlike classical bounds for conformal prediction methods that only offer marginal coverage. We experimentally show that our method is highly adaptive to the local data structure and outperforms existing methods in terms of conditional coverage, improving the reliability of statistical inference in various applications.
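To make the recipe in the abstract concrete, below is a minimal sketch of split conformal prediction with a quantile-rectified conformity score. It uses a simple ratio rectification (the raw score divided by an estimated conditional quantile of the score), which is one common way to realize the idea; the paper's trainable transformation may differ, and all names below are illustrative, not taken from the authors' code.

# Minimal sketch: split conformal prediction with a quantile-rectified
# conformity score. Illustrates the general idea only; not the paper's
# exact method. The rectifier here is a simple ratio s(x, y) / q_hat(x).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: noise scale grows with |x|.
n = 2000
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * np.abs(X[:, 0]) * rng.standard_normal(n)

# Split into training and calibration halves.
X_tr, X_cal = X[: n // 2], X[n // 2 :]
y_tr, y_cal = y[: n // 2], y[n // 2 :]

# Base point predictor and absolute-residual conformity score.
model = GradientBoostingRegressor().fit(X_tr, y_tr)
scores_tr = np.abs(y_tr - model.predict(X_tr))
scores_cal = np.abs(y_cal - model.predict(X_cal))

# Estimate a conditional quantile of the score via quantile regression.
# (In practice a separate data split would be used for the rectifier;
# reusing the training fold here keeps the sketch short.)
alpha = 0.1
q_model = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha)
q_model.fit(X_tr, scores_tr)

# Rectified score: raw score divided by the estimated conditional quantile.
eps = 1e-8
rect_cal = scores_cal / np.maximum(q_model.predict(X_cal), eps)

# Standard split-conformal calibration on the rectified scores; this step
# is what guarantees exact marginal coverage at level 1 - alpha.
n_cal = len(rect_cal)
k = int(np.ceil((n_cal + 1) * (1 - alpha)))
tau = np.sort(rect_cal)[k - 1]

# Prediction interval at a new point: center +/- tau * q_hat(x), so the
# width adapts to the local distribution of conformity scores.
x_new = np.array([[1.5]])
half_width = tau * max(q_model.predict(x_new)[0], eps)
center = model.predict(x_new)[0]
print(f"90% interval at x=1.5: [{center - half_width:.3f}, {center + half_width:.3f}]")

Because the calibration step is applied to the rectified scores, marginal coverage holds regardless of how well the quantile model fits; a better fit improves conditional coverage, which is the property the paper's theoretical bound quantifies.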

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-plassier25a,
  title     = {Rectifying Conformity Scores for Better Conditional Coverage},
  author    = {Plassier, Vincent and Fishkov, Alexander and Dheur, Victor and Guizani, Mohsen and Ben Taieb, Souhaib and Panov, Maxim and Moulines, Eric},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {49459--49492},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/plassier25a/plassier25a.pdf},
  url       = {https://proceedings.mlr.press/v267/plassier25a.html}
}
Endnote
%0 Conference Paper
%T Rectifying Conformity Scores for Better Conditional Coverage
%A Vincent Plassier
%A Alexander Fishkov
%A Victor Dheur
%A Mohsen Guizani
%A Souhaib Ben Taieb
%A Maxim Panov
%A Eric Moulines
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-plassier25a
%I PMLR
%P 49459--49492
%U https://proceedings.mlr.press/v267/plassier25a.html
%V 267
APA
Plassier, V., Fishkov, A., Dheur, V., Guizani, M., Ben Taieb, S., Panov, M. & Moulines, E. (2025). Rectifying Conformity Scores for Better Conditional Coverage. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:49459-49492. Available from https://proceedings.mlr.press/v267/plassier25a.html.