Asynchronous Personalized Federated Learning with Irregular Clients

Zichen Ma, Yu Lu, Wenye Li, Shuguang Cui
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:706-721, 2023.

Abstract

To provide intelligent and personalized models for clients, personalized federated learning (PFL) enables learning from data, identifying patterns, and making automated decisions in a privacy-preserving manner. PFL trains multiple clients independently and aggregates their models in synchronous steps. However, the assumptions made by existing works are unrealistic given the heterogeneity of clients: the volume and distribution of collected data vary over the course of training, and clients also differ in their available system configurations, which leads to vast heterogeneity across the system. To address these challenges, we present an asynchronous method (AsyPFL), in which clients learn personalized models with respect to local data by making the most informative parameters less volatile, and the central server aggregates model parameters asynchronously. In addition, we reformulate PFL by unifying the synchronous and asynchronous updating schemes through an asynchrony-related parameter. Theoretically, we show that AsyPFL's convergence rate is state-of-the-art and provide guarantees for choosing key hyperparameters optimally. With these theoretical guarantees, we validate AsyPFL on different tasks under non-IID and staleness settings. The results indicate that, given a large proportion of irregular clients, AsyPFL empirically outperforms vanilla PFL algorithms in both non-IID and IID cases.
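
To make the asynchronous-aggregation idea concrete, below is a minimal Python sketch of staleness-aware server-side mixing. The mixing rule, the staleness decay, and the role of the parameter alpha are illustrative assumptions consistent with the abstract, not the paper's actual AsyPFL update rule; the names staleness_weight and server_aggregate are hypothetical.

# A minimal sketch of staleness-aware asynchronous aggregation: the server
# mixes each arriving client model into the global model, down-weighting
# updates computed on older snapshots. The decay alpha / (1 + tau) and the
# asynchrony parameter alpha are assumptions for illustration only.
import numpy as np

def staleness_weight(tau: int, alpha: float) -> float:
    """Weight for an update computed on a snapshot that is tau rounds old.

    alpha in (0, 1] controls the degree of asynchrony: with alpha = 1 and
    tau = 0 the arriving model fully replaces the global one, mimicking a
    synchronous update for a single client; smaller alpha or larger tau
    makes the server more conservative.
    """
    return alpha / (1.0 + tau)

def server_aggregate(global_model: np.ndarray,
                     client_model: np.ndarray,
                     tau: int,
                     alpha: float = 0.5) -> np.ndarray:
    """Mix one (possibly stale) client model into the global model."""
    w = staleness_weight(tau, alpha)
    return (1.0 - w) * global_model + w * client_model

# Example: a client update trained on a snapshot that is 3 rounds old
# contributes with weight 0.5 / (1 + 3) = 0.125.
global_model = np.zeros(4)
client_model = np.ones(4)
print(server_aggregate(global_model, client_model, tau=3))  # [0.125 0.125 0.125 0.125]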

Cite this Paper


BibTeX
@InProceedings{pmlr-v189-ma23b,
  title     = {Asynchronous Personalized Federated Learning with Irregular Clients},
  author    = {Ma, Zichen and Lu, Yu and Li, Wenye and Cui, Shuguang},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  pages     = {706--721},
  year      = {2023},
  editor    = {Khan, Emtiyaz and Gonen, Mehmet},
  volume    = {189},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v189/ma23b/ma23b.pdf},
  url       = {https://proceedings.mlr.press/v189/ma23b.html}
}
Endnote
%0 Conference Paper
%T Asynchronous Personalized Federated Learning with Irregular Clients
%A Zichen Ma
%A Yu Lu
%A Wenye Li
%A Shuguang Cui
%B Proceedings of The 14th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Emtiyaz Khan
%E Mehmet Gonen
%F pmlr-v189-ma23b
%I PMLR
%P 706--721
%U https://proceedings.mlr.press/v189/ma23b.html
%V 189
APA
Ma, Z., Lu, Y., Li, W. & Cui, S. (2023). Asynchronous Personalized Federated Learning with Irregular Clients. Proceedings of The 14th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 189:706-721. Available from https://proceedings.mlr.press/v189/ma23b.html.