Federated Generalised Variational Inference: A Robust Probabilistic Federated Learning Framework

Terje Mildner, Oliver Hamelijnck, Paris Giampouras, Theodoros Damoulas
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:44134-44174, 2025.

Abstract

We introduce FedGVI, a probabilistic Federated Learning (FL) framework that is robust to both prior and likelihood misspecification. FedGVI addresses limitations in both frequentist and Bayesian FL by providing unbiased predictions under model misspecification, with calibrated uncertainty quantification. Our approach generalises previous FL approaches, specifically Partitioned Variational Inference (Ashman et al., 2022), by allowing robust and conjugate updates, decreasing computational complexity at the clients. We offer theoretical analysis in terms of fixed-point convergence, optimality of the cavity distribution, and provable robustness to likelihood misspecification. Further, we empirically demonstrate the effectiveness of FedGVI in terms of improved robustness and predictive performance on multiple synthetic and real-world classification data sets.

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-mildner25a,
  title     = {Federated Generalised Variational Inference: A Robust Probabilistic Federated Learning Framework},
  author    = {Mildner, Terje and Hamelijnck, Oliver and Giampouras, Paris and Damoulas, Theodoros},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {44134--44174},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/mildner25a/mildner25a.pdf},
  url       = {https://proceedings.mlr.press/v267/mildner25a.html},
  abstract  = {We introduce FedGVI, a probabilistic Federated Learning (FL) framework that is robust to both prior and likelihood misspecification. FedGVI addresses limitations in both frequentist and Bayesian FL by providing unbiased predictions under model misspecification, with calibrated uncertainty quantification. Our approach generalises previous FL approaches, specifically Partitioned Variational Inference (Ashman et al., 2022), by allowing robust and conjugate updates, decreasing computational complexity at the clients. We offer theoretical analysis in terms of fixed-point convergence, optimality of the cavity distribution, and provable robustness to likelihood misspecification. Further, we empirically demonstrate the effectiveness of FedGVI in terms of improved robustness and predictive performance on multiple synthetic and real world classification data sets.}
}
Endnote
%0 Conference Paper
%T Federated Generalised Variational Inference: A Robust Probabilistic Federated Learning Framework
%A Terje Mildner
%A Oliver Hamelijnck
%A Paris Giampouras
%A Theodoros Damoulas
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-mildner25a
%I PMLR
%P 44134--44174
%U https://proceedings.mlr.press/v267/mildner25a.html
%V 267
%X We introduce FedGVI, a probabilistic Federated Learning (FL) framework that is robust to both prior and likelihood misspecification. FedGVI addresses limitations in both frequentist and Bayesian FL by providing unbiased predictions under model misspecification, with calibrated uncertainty quantification. Our approach generalises previous FL approaches, specifically Partitioned Variational Inference (Ashman et al., 2022), by allowing robust and conjugate updates, decreasing computational complexity at the clients. We offer theoretical analysis in terms of fixed-point convergence, optimality of the cavity distribution, and provable robustness to likelihood misspecification. Further, we empirically demonstrate the effectiveness of FedGVI in terms of improved robustness and predictive performance on multiple synthetic and real world classification data sets.
APA
Mildner, T., Hamelijnck, O., Giampouras, P. &amp; Damoulas, T. (2025). Federated Generalised Variational Inference: A Robust Probabilistic Federated Learning Framework. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:44134-44174. Available from https://proceedings.mlr.press/v267/mildner25a.html.