Federated Conformal Predictors for Distributed Uncertainty Quantification

Charles Lu, Yaodong Yu, Sai Praneeth Karimireddy, Michael Jordan, Ramesh Raskar
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:22942-22964, 2023.

Abstract

Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning since it can be easily applied as a post-processing step to already trained models. In this paper, we extend conformal prediction to the federated learning setting. The main challenge we face is data heterogeneity across the clients — this violates the fundamental tenet of exchangeability required for conformal prediction. We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction (FCP) framework. We show FCP enjoys rigorous theoretical guarantees and excellent empirical performance on several computer vision and medical imaging datasets. Our results demonstrate a practical approach to incorporating meaningful uncertainty quantification in distributed and heterogeneous environments. We provide the code used in our experiments at https://github.com/clu5/federated-conformal.
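To make the idea concrete, the following is a minimal sketch of split conformal prediction adapted to a federated setting: each client computes nonconformity scores on its local calibration data and reports a conservative local quantile, and the server combines them into a single threshold. The max-aggregation rule, the uniform toy scores, and the label names here are illustrative assumptions, not the paper's FCP algorithm (which handles heterogeneity via partial exchangeability); see the linked repository for the actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.1  # target miscoverage rate (aim for ~90% coverage)

# Hypothetical per-client calibration scores, e.g. 1 - softmax probability
# of the true label. Three clients with different calibration set sizes.
client_scores = [rng.uniform(0, 1, size=n) for n in (200, 150, 300)]

def local_quantile(scores, alpha):
    """Conservative (1 - alpha) empirical quantile with the standard
    (n + 1) finite-sample correction used in split conformal prediction."""
    n = len(scores)
    level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(scores, min(level, 1.0), method="higher")

local_qs = [local_quantile(s, alpha) for s in client_scores]

# One simple (conservative) aggregation: take the largest local threshold.
# This is an illustrative choice, not the FCP aggregation from the paper.
tau = max(local_qs)

# Prediction set for a new example: all labels whose score falls below tau.
test_scores = {"cat": 0.2, "dog": 0.55, "bird": 0.97}
pred_set = {label for label, s in test_scores.items() if s <= tau}
print(f"threshold tau = {tau:.3f}, prediction set = {sorted(pred_set)}")
```

Because each local quantile uses the (n + 1) correction, the pooled threshold errs on the side of larger prediction sets, trading efficiency for coverage; the paper's FCP framework gives sharper guarantees under partial exchangeability.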

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-lu23i,
  title     = {Federated Conformal Predictors for Distributed Uncertainty Quantification},
  author    = {Lu, Charles and Yu, Yaodong and Karimireddy, Sai Praneeth and Jordan, Michael and Raskar, Ramesh},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {22942--22964},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/lu23i/lu23i.pdf},
  url       = {https://proceedings.mlr.press/v202/lu23i.html},
  abstract  = {Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning since it can be easily applied as a post-processing step to already trained models. In this paper, we extend conformal prediction to the federated learning setting. The main challenge we face is data heterogeneity across the clients — this violates the fundamental tenet of exchangeability required for conformal prediction. We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction (FCP) framework. We show FCP enjoys rigorous theoretical guarantees and excellent empirical performance on several computer vision and medical imaging datasets. Our results demonstrate a practical approach to incorporating meaningful uncertainty quantification in distributed and heterogeneous environments. We provide code used in our experiments https://github.com/clu5/federated-conformal.}
}
Endnote
%0 Conference Paper
%T Federated Conformal Predictors for Distributed Uncertainty Quantification
%A Charles Lu
%A Yaodong Yu
%A Sai Praneeth Karimireddy
%A Michael Jordan
%A Ramesh Raskar
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-lu23i
%I PMLR
%P 22942--22964
%U https://proceedings.mlr.press/v202/lu23i.html
%V 202
%X Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning since it can be easily applied as a post-processing step to already trained models. In this paper, we extend conformal prediction to the federated learning setting. The main challenge we face is data heterogeneity across the clients — this violates the fundamental tenet of exchangeability required for conformal prediction. We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction (FCP) framework. We show FCP enjoys rigorous theoretical guarantees and excellent empirical performance on several computer vision and medical imaging datasets. Our results demonstrate a practical approach to incorporating meaningful uncertainty quantification in distributed and heterogeneous environments. We provide code used in our experiments https://github.com/clu5/federated-conformal.
APA
Lu, C., Yu, Y., Karimireddy, S.P., Jordan, M. & Raskar, R. (2023). Federated Conformal Predictors for Distributed Uncertainty Quantification. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:22942-22964. Available from https://proceedings.mlr.press/v202/lu23i.html.