Two fundamental limits for uncertainty quantification in predictive inference

Felipe Areces, Chen Cheng, John Duchi, Kuditipudi Rohith
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:186-218, 2024.

Abstract

We study the statistical hardness of estimating two basic representations of uncertainty in predictive inference: prediction sets and calibration error. First, we show that conformal prediction sets cannot approach a desired weighted conformal coverage level—with respect to a family of binary witness functions with VC dimension $d$—at a minimax rate faster than $O(d^{1/2}n^{-1/2})$. We also show that the algorithm in Gibbs et al. (2023) achieves this rate and that extending our class of conformal sets beyond thresholds of non-conformity scores to include arbitrary convex sets of non-conformity scores only improves the minimax rate by a constant factor. Then, under a similar VC dimension constraint on the witness function class, we show it is not possible to estimate the weighted weak calibration error at a minimax rate faster than $O(d^{1/4}n^{-1/2})$. We show that the algorithm in Kumar et al. (2019) achieves this rate in the particular case of estimating the squared weak calibration error of a predictor that outputs $d$ distinct values.
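The prediction sets the paper studies are thresholds of non-conformity scores. As background, the standard split-conformal construction picks the score threshold from a calibration set so that marginal coverage $1-\alpha$ holds for exchangeable data. A minimal sketch (the residual score and the sample sizes here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def split_conformal_threshold(cal_scores, alpha):
    """Threshold on non-conformity scores giving marginal coverage >= 1 - alpha.

    Standard split-conformal quantile with the (n + 1) finite-sample correction,
    valid under exchangeability of calibration and test points.
    """
    n = len(cal_scores)
    level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(cal_scores, min(level, 1.0), method="higher")

rng = np.random.default_rng(0)
# Hypothetical example: absolute residuals |y - f(x)| as non-conformity scores
cal = np.abs(rng.normal(size=1000))
q = split_conformal_threshold(cal, alpha=0.1)

# On exchangeable test data, P(score <= q) >= 0.9 marginally
test = np.abs(rng.normal(size=5000))
coverage = np.mean(test <= q)
```

The paper's lower bound concerns a stronger, weighted notion of coverage (uniform over a VC class of witness functions), which this marginal construction does not by itself guarantee.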

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-areces24a,
  title     = {Two fundamental limits for uncertainty quantification in predictive inference},
  author    = {Areces, Felipe and Cheng, Chen and Duchi, John and Rohith, Kuditipudi},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {186--218},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/areces24a/areces24a.pdf},
  url       = {https://proceedings.mlr.press/v247/areces24a.html},
  abstract  = {We study the statistical hardness of estimating two basic representations of uncertainty in predictive inference: prediction sets and calibration error. First, we show that conformal prediction sets cannot approach a desired weighted conformal coverage level—with respect to a family of binary witness functions with VC dimension $d$—at a minimax rate faster than $O(d^{1/2}n^{-1/2})$. We also show that the algorithm in Gibbs et al. (2023) achieves this rate and that extending our class of conformal sets beyond thresholds of non-conformity scores to include arbitrary convex sets of non-conformity scores only improves the minimax rate by a constant factor. Then, under a similar VC dimension constraint on the witness function class, we show it is not possible to estimate the weighted weak calibration error at a minimax rate faster than $O(d^{1/4}n^{-1/2})$. We show that the algorithm in Kumar et al. (2019) achieves this rate in the particular case of estimating the squared weak calibration error of a predictor that outputs $d$ distinct values.}
}
Endnote
%0 Conference Paper
%T Two fundamental limits for uncertainty quantification in predictive inference
%A Felipe Areces
%A Chen Cheng
%A John Duchi
%A Kuditipudi Rohith
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-areces24a
%I PMLR
%P 186--218
%U https://proceedings.mlr.press/v247/areces24a.html
%V 247
%X We study the statistical hardness of estimating two basic representations of uncertainty in predictive inference: prediction sets and calibration error. First, we show that conformal prediction sets cannot approach a desired weighted conformal coverage level—with respect to a family of binary witness functions with VC dimension $d$—at a minimax rate faster than $O(d^{1/2}n^{-1/2})$. We also show that the algorithm in Gibbs et al. (2023) achieves this rate and that extending our class of conformal sets beyond thresholds of non-conformity scores to include arbitrary convex sets of non-conformity scores only improves the minimax rate by a constant factor. Then, under a similar VC dimension constraint on the witness function class, we show it is not possible to estimate the weighted weak calibration error at a minimax rate faster than $O(d^{1/4}n^{-1/2})$. We show that the algorithm in Kumar et al. (2019) achieves this rate in the particular case of estimating the squared weak calibration error of a predictor that outputs $d$ distinct values.
APA
Areces, F., Cheng, C., Duchi, J. & Rohith, K. (2024). Two fundamental limits for uncertainty quantification in predictive inference. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:186-218. Available from https://proceedings.mlr.press/v247/areces24a.html.