FedSPD: A Soft-clustering Approach for Personalized Decentralized Federated Learning

I-Cheng Lin, Osman Yagan, Carlee Joe-Wong
Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, PMLR 286:2618-2641, 2025.

Abstract

Federated learning has recently gained popularity as a framework for distributed clients to collaboratively train a machine learning model using local data. While traditional federated learning relies on a central server for model aggregation, recent advancements adopt a decentralized framework, enabling direct model exchange between clients and eliminating the single point of failure. However, existing decentralized frameworks often assume all clients train a shared model. Personalizing each client’s model can enhance performance, especially with heterogeneous client data distributions. We propose FedSPD, an efficient personalized federated learning algorithm for the decentralized setting, and show that it learns accurate models in low-connectivity networks. To provide theoretical guarantees on convergence, we introduce a clustering-based framework that enables consensus on models for distinct data clusters while personalizing to unique mixtures of these clusters at different clients. This flexibility, allowing selective model updates based on data distribution, substantially reduces communication costs compared to prior work on personalized federated learning in decentralized settings. Experimental results on real-world datasets show that FedSPD outperforms multiple decentralized variants of existing personalized federated learning algorithms in scenarios with low-connectivity networks.
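
The abstract describes the mechanism only at a high level. As a rough illustration of the soft-clustering idea it outlines (each client maintains one model per data cluster, soft-assigns its local samples to the clusters, updates the cluster models accordingly, and averages them with its network neighbors), here is a minimal Python sketch. Everything in it (the Client class, local_step, gossip_round, the linear per-cluster models, softmax soft assignments, and pairwise gossip averaging) is a hypothetical reconstruction for intuition, not the paper's actual FedSPD algorithm.

# Hypothetical sketch of a soft-clustering decentralized FL round; all names
# and modeling choices are illustrative assumptions, not the paper's method.
import numpy as np

K, D = 3, 5            # number of clusters, feature dimension
LR, TEMP = 0.1, 1.0    # learning rate, softmax temperature

class Client:
    def __init__(self, X, y, rng):
        self.X, self.y = X, y
        self.W = rng.normal(size=(K, D))   # one linear model per cluster

    def soft_assign(self):
        # per-sample squared error under each cluster model -> softmax weights
        preds = self.X @ self.W.T                      # (n, K)
        losses = (preds - self.y[:, None]) ** 2        # (n, K)
        logits = -losses / TEMP
        logits -= logits.max(axis=1, keepdims=True)
        w = np.exp(logits)
        return w / w.sum(axis=1, keepdims=True)        # (n, K)

    def local_step(self):
        # weighted least-squares gradient step for each cluster model
        a = self.soft_assign()
        for k in range(K):
            r = self.X @ self.W[k] - self.y            # residuals
            g = (a[:, k] * r) @ self.X / len(self.y)   # weighted gradient
            self.W[k] -= LR * g

def gossip_round(clients, edges):
    # average each client's cluster models with its neighbors' models
    new = {i: c.W.copy() for i, c in enumerate(clients)}
    deg = {i: 1 for i in range(len(clients))}
    for i, j in edges:
        new[i] += clients[j].W
        new[j] += clients[i].W
        deg[i] += 1
        deg[j] += 1
    for i, c in enumerate(clients):
        c.W = new[i] / deg[i]

rng = np.random.default_rng(0)
true_W = rng.normal(size=(K, D))
clients = []
for i in range(6):
    mix = rng.dirichlet(np.ones(K))                # client's cluster mixture
    ks = rng.choice(K, size=200, p=mix)
    X = rng.normal(size=(200, D))
    y = np.einsum('nd,nd->n', X, true_W[ks]) + 0.1 * rng.normal(size=200)
    clients.append(Client(X, y, rng))

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]   # a sparse (path) topology
for t in range(100):
    for c in clients:
        c.local_step()
    gossip_round(clients, edges)

In this sketch, consensus is driven per cluster model by the gossip averaging, while personalization comes from each client predicting with the shared cluster models weighted by its own soft assignments, i.e., its local mixture over clusters.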

Cite this Paper


BibTeX
@InProceedings{pmlr-v286-lin25a,
  title     = {FedSPD: A Soft-clustering Approach for Personalized Decentralized Federated Learning},
  author    = {Lin, I-Cheng and Yagan, Osman and Joe-Wong, Carlee},
  booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
  pages     = {2618--2641},
  year      = {2025},
  editor    = {Chiappa, Silvia and Magliacane, Sara},
  volume    = {286},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--25 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v286/main/assets/lin25a/lin25a.pdf},
  url       = {https://proceedings.mlr.press/v286/lin25a.html}
}
Endnote
%0 Conference Paper
%T FedSPD: A Soft-clustering Approach for Personalized Decentralized Federated Learning
%A I-Cheng Lin
%A Osman Yagan
%A Carlee Joe-Wong
%B Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2025
%E Silvia Chiappa
%E Sara Magliacane
%F pmlr-v286-lin25a
%I PMLR
%P 2618--2641
%U https://proceedings.mlr.press/v286/lin25a.html
%V 286
APA
Lin, I., Yagan, O., & Joe-Wong, C. (2025). FedSPD: A Soft-clustering Approach for Personalized Decentralized Federated Learning. Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 286:2618-2641. Available from https://proceedings.mlr.press/v286/lin25a.html.
