FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler

Hongyi Peng, Han Yu, Xiaoli Tang, Xiaoxiao Li
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:40331-40346, 2024.

Abstract

Federated learning (FL) enables collaborative machine learning across distributed data owners, but data heterogeneity poses a challenge for model calibration. While prior work has focused on improving accuracy under non-IID data, calibration remains under-explored. This study reveals that existing FL aggregation approaches lead to sub-optimal calibration, and our theoretical analysis shows that, even when the variance of clients' label distributions is constrained, the global calibration error remains asymptotically lower bounded. To address this, we propose the Federated Calibration (FedCal) approach, which targets both local and global calibration. It leverages client-specific scalers for local calibration, effectively correcting output misalignment without sacrificing prediction accuracy. These scalers are then aggregated via weight averaging to generate a global scaler that minimizes the global calibration error. Extensive experiments demonstrate that FedCal significantly outperforms the best-performing baseline, reducing global calibration error by 47.66% on average.
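
The abstract describes the mechanism at a high level: each client fits a parameterized scaler on its own data, and the server weight-averages the scaler parameters into a global scaler. The sketch below illustrates one way this pipeline could look; it is a minimal sketch, not the authors' implementation. The Scaler architecture, the fit_local_scaler and aggregate_scalers helpers, and all hyperparameters are assumptions made for illustration.

# Minimal sketch of the FedCal idea from the abstract; all names and the
# scaler architecture are illustrative assumptions, not the paper's code.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class Scaler(nn.Module):
    """Hypothetical parameterized scaler: a small MLP that recalibrates
    logits. The paper's exact scaler design may differ (e.g., it may be
    constrained to preserve the argmax so accuracy is untouched)."""
    def __init__(self, num_classes: int, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        return self.net(logits)

def fit_local_scaler(scaler, logits, labels, epochs=100, lr=1e-2):
    """Client-side local calibration: fit the scaler on held-out local
    (logit, label) pairs while the classifier itself stays frozen."""
    opt = torch.optim.Adam(scaler.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(scaler(logits), labels)
        loss.backward()
        opt.step()
    return scaler

def aggregate_scalers(scalers, weights):
    """Server-side: weight-average client scaler parameters (FedAvg-style)
    into a single global scaler. `weights` are assumed to sum to 1, e.g.
    proportional to client dataset sizes."""
    global_scaler = copy.deepcopy(scalers[0])
    state = global_scaler.state_dict()
    for key in state:
        state[key] = sum(w * s.state_dict()[key]
                         for w, s in zip(weights, scalers))
    global_scaler.load_state_dict(state)
    return global_scaler

Under this reading, the global scaler lets clients that hold no calibration data still recalibrate the global model's outputs, which is consistent with the abstract's claim of reducing global calibration error without retraining the classifier.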

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-peng24g,
  title     = {{F}ed{C}al: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler},
  author    = {Peng, Hongyi and Yu, Han and Tang, Xiaoli and Li, Xiaoxiao},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {40331--40346},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/peng24g/peng24g.pdf},
  url       = {https://proceedings.mlr.press/v235/peng24g.html}
}
Endnote
%0 Conference Paper
%T FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler
%A Hongyi Peng
%A Han Yu
%A Xiaoli Tang
%A Xiaoxiao Li
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-peng24g
%I PMLR
%P 40331--40346
%U https://proceedings.mlr.press/v235/peng24g.html
%V 235
APA
Peng, H., Yu, H., Tang, X. & Li, X. (2024). FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:40331-40346. Available from https://proceedings.mlr.press/v235/peng24g.html.