Improving Adaptive Conformal Prediction Using Self-Supervised Learning

Nabeel Seedat, Alan Jeffares, Fergus Imrie, Mihaela van der Schaar
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:10160-10177, 2023.

Abstract

Conformal prediction is a powerful distribution-free tool for uncertainty quantification, establishing valid prediction intervals with finite-sample guarantees. To produce valid intervals that are also adaptive to the difficulty of each instance, a common approach is to compute normalized nonconformity scores on a separate calibration set. Self-supervised learning has been effectively utilized in many domains to learn general representations for downstream predictors. However, the use of self-supervision beyond model pretraining and representation learning has been largely unexplored. In this work, we investigate how self-supervised pretext tasks can improve the quality of conformal regressors, specifically by improving the adaptability of conformal intervals. We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores. We empirically demonstrate, on both synthetic and real data, the benefit of this additional information for the efficiency (width), deficit, and excess of conformal prediction intervals.
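
As a rough illustration of the approach described in the abstract (a sketch, not the authors' implementation), the Python snippet below builds a normalized split conformal regressor in which a per-instance difficulty model receives the error of a self-supervised auxiliary model as an extra feature. All names are illustrative; an input-reconstruction (autoencoder-style) pretext task and scikit-learn models stand in for the paper's particular modeling choices.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    # Synthetic data split into train / calibration / test.
    X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
    X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
    X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

    # Base predictive model.
    model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

    # Auxiliary self-supervised model: a simple input-reconstruction pretext
    # task stands in here for the paper's pretext task.
    ssl = MLPRegressor(hidden_layer_sizes=(32, 8, 32), max_iter=1000, random_state=0)
    ssl.fit(X_train, X_train)

    def ssl_error(X):
        """Per-instance self-supervised (reconstruction) error."""
        return np.mean((ssl.predict(X) - X) ** 2, axis=1)

    # Difficulty (normalization) model: predict |residual| from the inputs
    # augmented with the self-supervised error. (For simplicity it is fit on
    # the training split; a separate split could also be used.)
    resid_train = np.abs(y_train - model.predict(X_train))
    Z_train = np.column_stack([X_train, ssl_error(X_train)])
    difficulty = RandomForestRegressor(random_state=0).fit(Z_train, resid_train)

    # Calibration: normalized nonconformity scores and their conformal quantile.
    eps = 1e-6  # guards against division by zero
    sigma_cal = difficulty.predict(np.column_stack([X_cal, ssl_error(X_cal)])) + eps
    scores = np.abs(y_cal - model.predict(X_cal)) / sigma_cal
    alpha = 0.1
    n = len(scores)
    q_hat = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

    # Adaptive prediction intervals on the test set.
    sigma_test = difficulty.predict(np.column_stack([X_test, ssl_error(X_test)])) + eps
    y_hat = model.predict(X_test)
    lower, upper = y_hat - q_hat * sigma_test, y_hat + q_hat * sigma_test
    print("empirical coverage:", np.mean((y_test >= lower) & (y_test <= upper)))

Under exchangeability, the quantile step preserves the marginal coverage guarantee regardless of how the normalizer is estimated; the self-supervised error only changes how the interval width adapts across instances, which is exactly the quality the paper targets.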

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-seedat23a,
  title     = {Improving Adaptive Conformal Prediction Using Self-Supervised Learning},
  author    = {Seedat, Nabeel and Jeffares, Alan and Imrie, Fergus and van der Schaar, Mihaela},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {10160--10177},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/seedat23a/seedat23a.pdf},
  url       = {https://proceedings.mlr.press/v206/seedat23a.html}
}
Endnote
%0 Conference Paper
%T Improving Adaptive Conformal Prediction Using Self-Supervised Learning
%A Nabeel Seedat
%A Alan Jeffares
%A Fergus Imrie
%A Mihaela van der Schaar
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-seedat23a
%I PMLR
%P 10160--10177
%U https://proceedings.mlr.press/v206/seedat23a.html
%V 206
APA
Seedat, N., Jeffares, A., Imrie, F. & van der Schaar, M. (2023). Improving Adaptive Conformal Prediction Using Self-Supervised Learning. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:10160-10177. Available from https://proceedings.mlr.press/v206/seedat23a.html.