Federated Learning under Arbitrary Communication Patterns

Dmitrii Avdiukhin, Shiva Kasiviswanathan
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:425-435, 2021.

Abstract

Federated Learning is a distributed learning setting where the goal is to train a centralized model with training data distributed over a large number of heterogeneous clients, each with unreliable and relatively slow network connections. A common optimization approach used in federated learning is based on the idea of local SGD: each client runs some number of SGD steps locally, and the updated local models are then averaged to form the updated global model on the coordinating server. In this paper, we investigate the performance of an asynchronous version of local SGD wherein the clients can communicate with the server at arbitrary time intervals. Our main result shows that for smooth strongly convex and smooth nonconvex functions, we achieve convergence rates that match those of the synchronous version, which requires all clients to communicate simultaneously.
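
The local-SGD scheme described in the abstract can be illustrated with a short sketch. The NumPy code below is not the paper's algorithm or analysis setting: the toy least-squares objective, the client count, the learning rate, the per-round step counts, and the rule by which an asynchronous client decides when to report are all assumptions made purely for illustration. It contrasts a synchronous round (all clients average simultaneously) with an asynchronous schedule in which each client reports after its own, arbitrary number of local steps.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: each client holds data for a least-squares objective
    #   f_i(w) = 1/(2 n_i) ||X_i w - y_i||^2
    # (objective, sizes, and step counts are illustrative, not from the paper).
    d, n_clients, n_samples = 5, 4, 50
    w_true = rng.normal(size=d)
    clients = []
    for _ in range(n_clients):
        X = rng.normal(size=(n_samples, d))
        y = X @ w_true + 0.1 * rng.normal(size=n_samples)
        clients.append((X, y))

    def local_sgd(w, X, y, steps, lr=0.01, batch=8):
        """Run `steps` stochastic gradient steps starting from the model w."""
        w = w.copy()
        for _ in range(steps):
            idx = rng.choice(len(y), size=batch, replace=False)
            grad = X[idx].T @ (X[idx] @ w - y[idx]) / batch
            w -= lr * grad
        return w

    # Synchronous local SGD (FedAvg-style): every client runs the same number
    # of local steps, then the server averages all local models at once.
    w_sync = np.zeros(d)
    for _ in range(20):
        local_models = [local_sgd(w_sync, X, y, steps=10) for X, y in clients]
        w_sync = np.mean(local_models, axis=0)

    # Asynchronous variant: clients communicate at arbitrary times (here,
    # each reports after a randomly chosen number of local steps). When a
    # client reports, the server folds in its update and sends back the
    # current global model.
    w_async = np.zeros(d)
    client_models = [w_async.copy() for _ in clients]
    for _ in range(80):
        i = rng.integers(n_clients)              # which client communicates next
        steps = rng.integers(1, 20)              # arbitrary amount of local work
        X, y = clients[i]
        new_local = local_sgd(client_models[i], X, y, steps=steps)
        delta = new_local - client_models[i]     # this client's local progress
        w_async += delta / n_clients             # one simple server update rule
        client_models[i] = w_async.copy()        # client resumes from server model

    print("sync  error:", np.linalg.norm(w_sync - w_true))
    print("async error:", np.linalg.norm(w_async - w_true))

The scaled update `delta / n_clients` is just one simple way for the server to incorporate a single client's progress; the update and averaging rules analyzed in the paper may differ.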

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-avdiukhin21a,
  title     = {Federated Learning under Arbitrary Communication Patterns},
  author    = {Avdiukhin, Dmitrii and Kasiviswanathan, Shiva},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {425--435},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/avdiukhin21a/avdiukhin21a.pdf},
  url       = {https://proceedings.mlr.press/v139/avdiukhin21a.html}
}
Endnote
%0 Conference Paper
%T Federated Learning under Arbitrary Communication Patterns
%A Dmitrii Avdiukhin
%A Shiva Kasiviswanathan
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-avdiukhin21a
%I PMLR
%P 425--435
%U https://proceedings.mlr.press/v139/avdiukhin21a.html
%V 139
APA
Avdiukhin, D. & Kasiviswanathan, S. (2021). Federated Learning under Arbitrary Communication Patterns. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:425-435. Available from https://proceedings.mlr.press/v139/avdiukhin21a.html.