Asynchronous SGD on Graphs: a Unified Framework for Asynchronous Decentralized and Federated Optimization

Mathieu Even, Anastasia Koloskova, Laurent Massoulie
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:64-72, 2024.

Abstract

Decentralized and asynchronous communications are two popular techniques to reduce the communication complexity of distributed machine learning, by respectively removing the dependency on a central orchestrator and the need for synchronization. Yet, combining these two techniques remains a challenge. In this paper, we take a step in this direction and introduce Asynchronous SGD on Graphs (AGRAF SGD), a general algorithmic framework that covers asynchronous versions of many popular algorithms, including SGD, Decentralized SGD, Local SGD, and FedBuff, thanks to its relaxed communication and computation assumptions. We provide rates of convergence under much milder assumptions than previous decentralized asynchronous works, while still recovering or even improving over the best known results for all the algorithms covered.
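To make the setting concrete, the sketch below is a toy illustration of asynchronous, decentralized gossip-style SGD on a graph; it is not the paper's AGRAF SGD algorithm (which the abstract only names), and all the quantities in it (the ring topology, the quadratic local objectives, the step size) are illustrative assumptions. Each node holds a local objective, and at each tick a single random node takes a local gradient step and then averages with one random neighbor, with no central orchestrator and no synchronization barrier.

```python
import numpy as np

# Toy sketch only, NOT the AGRAF SGD algorithm from the paper.
# Each node i minimizes the local quadratic f_i(x) = 0.5 * (x - b_i)^2,
# so the global minimizer of (1/n) * sum_i f_i is mean(b).
rng = np.random.default_rng(0)
n = 8                           # nodes arranged on a ring graph
b = rng.normal(size=n)          # per-node data (local targets)
x = np.zeros(n)                 # one scalar iterate per node
lr = 0.01                       # constant step size (assumption)

for t in range(20000):
    # Asynchronous flavor: one random node wakes up per tick.
    i = rng.integers(n)
    x[i] -= lr * (x[i] - b[i])          # local gradient step at node i
    j = (i + rng.choice([-1, 1])) % n   # pick a random ring neighbor
    x[i] = x[j] = 0.5 * (x[i] + x[j])   # pairwise gossip averaging

# With a small constant step size, all iterates hover near mean(b),
# up to a heterogeneity-dependent error floor.
print(x - b.mean())
```

With a constant step size the iterates do not converge exactly but settle in a neighborhood of the global minimizer whose radius shrinks with the step size; this mirrors the usual trade-off in decentralized SGD analyses between speed and steady-state error.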

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-even24a,
  title     = {Asynchronous {SGD} on Graphs: a Unified Framework for Asynchronous Decentralized and Federated Optimization},
  author    = {Even, Mathieu and Koloskova, Anastasia and Massoulie, Laurent},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {64--72},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/even24a/even24a.pdf},
  url       = {https://proceedings.mlr.press/v238/even24a.html},
  abstract  = {Decentralized and asynchronous communications are two popular techniques to reduce the communication complexity of distributed machine learning, by respectively removing the dependency on a central orchestrator and the need for synchronization. Yet, combining these two techniques remains a challenge. In this paper, we take a step in this direction and introduce Asynchronous SGD on Graphs (AGRAF SGD), a general algorithmic framework that covers asynchronous versions of many popular algorithms, including SGD, Decentralized SGD, Local SGD, and FedBuff, thanks to its relaxed communication and computation assumptions. We provide rates of convergence under much milder assumptions than previous decentralized asynchronous works, while still recovering or even improving over the best known results for all the algorithms covered.}
}
Endnote
%0 Conference Paper
%T Asynchronous SGD on Graphs: a Unified Framework for Asynchronous Decentralized and Federated Optimization
%A Mathieu Even
%A Anastasia Koloskova
%A Laurent Massoulie
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-even24a
%I PMLR
%P 64--72
%U https://proceedings.mlr.press/v238/even24a.html
%V 238
%X Decentralized and asynchronous communications are two popular techniques to reduce the communication complexity of distributed machine learning, by respectively removing the dependency on a central orchestrator and the need for synchronization. Yet, combining these two techniques remains a challenge. In this paper, we take a step in this direction and introduce Asynchronous SGD on Graphs (AGRAF SGD), a general algorithmic framework that covers asynchronous versions of many popular algorithms, including SGD, Decentralized SGD, Local SGD, and FedBuff, thanks to its relaxed communication and computation assumptions. We provide rates of convergence under much milder assumptions than previous decentralized asynchronous works, while still recovering or even improving over the best known results for all the algorithms covered.
APA
Even, M., Koloskova, A. &amp; Massoulie, L. (2024). Asynchronous SGD on Graphs: a Unified Framework for Asynchronous Decentralized and Federated Optimization. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:64-72. Available from https://proceedings.mlr.press/v238/even24a.html.