Personalizing Low-Rank Bayesian Neural Networks Via Federated Learning

Boning Zhang, Dongzhu Liu, Osvaldo Simeone, Guanchu Wang, Dimitrios Pezaros, Guangxu Zhu
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:5041-5049, 2025.

Abstract

To support real-world decision-making, it is crucial for models to be well-calibrated, i.e., to assign reliable confidence estimates to their predictions. Uncertainty quantification is particularly important in personalized federated learning (PFL), as participating clients typically have small local datasets, making it difficult to unambiguously determine optimal model parameters. Bayesian PFL (BPFL) methods can potentially enhance calibration, but they often come with considerable computational and memory requirements due to the need to track the variances of all the individual model parameters. Furthermore, different clients may exhibit heterogeneous uncertainty levels owing to varying local dataset sizes and distributions. To address these challenges, we propose LR-BPFL, a novel BPFL method that learns a global deterministic model along with personalized low-rank Bayesian corrections. To tailor the local model to each client's inherent uncertainty level, LR-BPFL incorporates an adaptive rank selection mechanism. We evaluate LR-BPFL across a variety of datasets, demonstrating its advantages in terms of calibration, accuracy, and computational and memory requirements. The code is available at https://github.com/Bernie0115/LR-BPFL.
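
As a rough illustration of the architecture the abstract describes, the sketch below shows a PyTorch-style linear layer that combines a shared deterministic weight with a client-specific low-rank Bayesian correction. This is an assumption-laden reading of the abstract, not the authors' implementation (see the linked repository for that): the class name, the Gaussian mean-field parameterization of the factors, and the fixed rank are all illustrative, and the paper's adaptive rank selection mechanism is omitted.

```python
import torch
import torch.nn as nn

class LowRankBayesianLinear(nn.Module):
    """Hypothetical sketch: shared deterministic weight W plus a
    low-rank Bayesian correction U @ V^T with Gaussian variational
    factors. Illustrative only; not the paper's actual code."""

    def __init__(self, in_features: int, out_features: int, rank: int = 4):
        super().__init__()
        # Global deterministic weight, aggregated across clients by the server.
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)
        # Variational mean and log-std of the low-rank factors,
        # kept local to each client for personalization.
        self.u_mu = nn.Parameter(torch.zeros(out_features, rank))
        self.u_logstd = nn.Parameter(torch.full((out_features, rank), -5.0))
        self.v_mu = nn.Parameter(torch.zeros(in_features, rank))
        self.v_logstd = nn.Parameter(torch.full((in_features, rank), -5.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reparameterization trick: sample the factors, then form the correction.
        u = self.u_mu + self.u_logstd.exp() * torch.randn_like(self.u_mu)
        v = self.v_mu + self.v_logstd.exp() * torch.randn_like(self.v_mu)
        w = self.weight + u @ v.t()  # effective weight = global + low-rank correction
        return x @ w.t()
```

Under these assumptions, the variational parameters scale as O((d_in + d_out) r) per layer rather than the O(d_in d_out) of a fully Bayesian layer, which is the memory saving the abstract points to; in a federated round, only `weight` would be aggregated at the server while the Bayesian factors stay on-device.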

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-zhang25l,
  title     = {Personalizing Low-Rank Bayesian Neural Networks Via Federated Learning},
  author    = {Zhang, Boning and Liu, Dongzhu and Simeone, Osvaldo and Wang, Guanchu and Pezaros, Dimitrios and Zhu, Guangxu},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {5041--5049},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/zhang25l/zhang25l.pdf},
  url       = {https://proceedings.mlr.press/v258/zhang25l.html},
  abstract  = {To support real-world decision-making, it is crucial for models to be well-calibrated, i.e., to assign reliable confidence estimates to their predictions. Uncertainty quantification is particularly important in personalized federated learning (PFL), as participating clients typically have small local datasets, making it difficult to unambiguously determine optimal model parameters. Bayesian PFL (BPFL) methods can potentially enhance calibration, but they often come with considerable computational and memory requirements due to the need to track the variances of all the individual model parameters. Furthermore, different clients may exhibit heterogeneous uncertainty levels owing to varying local dataset sizes and distributions. To address these challenges, we propose LR-BPFL, a novel BPFL method that learns a global deterministic model along with personalized low-rank Bayesian corrections. To tailor the local model to each client's inherent uncertainty level, LR-BPFL incorporates an adaptive rank selection mechanism. We evaluate LR-BPFL across a variety of datasets, demonstrating its advantages in terms of calibration, accuracy, and computational and memory requirements. The code is available at \url{https://github.com/Bernie0115/LR-BPFL}.}
}
Endnote
%0 Conference Paper
%T Personalizing Low-Rank Bayesian Neural Networks Via Federated Learning
%A Boning Zhang
%A Dongzhu Liu
%A Osvaldo Simeone
%A Guanchu Wang
%A Dimitrios Pezaros
%A Guangxu Zhu
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-zhang25l
%I PMLR
%P 5041--5049
%U https://proceedings.mlr.press/v258/zhang25l.html
%V 258
%X To support real-world decision-making, it is crucial for models to be well-calibrated, i.e., to assign reliable confidence estimates to their predictions. Uncertainty quantification is particularly important in personalized federated learning (PFL), as participating clients typically have small local datasets, making it difficult to unambiguously determine optimal model parameters. Bayesian PFL (BPFL) methods can potentially enhance calibration, but they often come with considerable computational and memory requirements due to the need to track the variances of all the individual model parameters. Furthermore, different clients may exhibit heterogeneous uncertainty levels owing to varying local dataset sizes and distributions. To address these challenges, we propose LR-BPFL, a novel BPFL method that learns a global deterministic model along with personalized low-rank Bayesian corrections. To tailor the local model to each client's inherent uncertainty level, LR-BPFL incorporates an adaptive rank selection mechanism. We evaluate LR-BPFL across a variety of datasets, demonstrating its advantages in terms of calibration, accuracy, and computational and memory requirements. The code is available at https://github.com/Bernie0115/LR-BPFL.
APA
Zhang, B., Liu, D., Simeone, O., Wang, G., Pezaros, D. & Zhu, G. (2025). Personalizing Low-Rank Bayesian Neural Networks Via Federated Learning. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:5041-5049. Available from https://proceedings.mlr.press/v258/zhang25l.html.