Is Consensus Acceleration Possible in Decentralized Optimization over Slowly Time-Varying Networks?

Dmitry Metelev, Alexander Rogozin, Dmitry Kovalev, Alexander Gasnikov
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:24532-24554, 2023.

Abstract

We consider decentralized optimization problems in which one aims to minimize a sum of convex smooth objective functions distributed over the nodes of a network. The links in the network can change over time. For the setting where the amount of change is arbitrary, lower complexity bounds and corresponding optimal algorithms are known, and consensus acceleration is not possible. In practice, however, the magnitude of network changes may be limited. We derive lower complexity bounds for several regimes of network change velocity. Moreover, we show how to obtain accelerated communication rates for a certain class of time-varying graphs using a specific consensus algorithm.
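To make the setting concrete, the following is a minimal illustrative sketch of average consensus over a time-varying network, the communication primitive whose acceleration the paper studies. It is not the paper's algorithm: the graph sequence, the Metropolis mixing weights, and all names here are assumptions chosen for illustration. Each round, the active graph changes and every node averages its value with its current neighbors via a doubly stochastic mixing matrix.

```python
# Illustrative average-consensus sketch over a time-varying graph
# (not the paper's method; Metropolis weights are one common choice).
import numpy as np

def mixing_matrix(edges, n):
    """Doubly stochastic Metropolis weights for an undirected edge list."""
    deg = np.zeros(n)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    W = np.zeros((n, n))
    for i, j in edges:
        w = 1.0 / (1.0 + max(deg[i], deg[j]))
        W[i, j] = W[j, i] = w
    np.fill_diagonal(W, 1.0 - W.sum(axis=1))  # self-weights keep rows stochastic
    return W

n = 4
x = np.array([0.0, 1.0, 2.0, 9.0])  # initial local values, one per node
target = x.mean()                    # consensus value preserved by W_k

# The network changes every round: alternate between two connected graphs.
graphs = [[(0, 1), (1, 2), (2, 3)],
          [(0, 2), (2, 1), (1, 3)]]
for k in range(200):
    W = mixing_matrix(graphs[k % 2], n)
    x = W @ x                        # one gossip/consensus round

assert np.allclose(x, target)        # all nodes converge to the average
```

Because each `W` is doubly stochastic, the network average is invariant across rounds, and connectivity of every graph in the sequence drives all node values toward it; the paper asks when this convergence can be accelerated despite the graph changing.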

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-metelev23a,
  title =     {Is Consensus Acceleration Possible in Decentralized Optimization over Slowly Time-Varying Networks?},
  author =    {Metelev, Dmitry and Rogozin, Alexander and Kovalev, Dmitry and Gasnikov, Alexander},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages =     {24532--24554},
  year =      {2023},
  editor =    {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume =    {202},
  series =    {Proceedings of Machine Learning Research},
  month =     {23--29 Jul},
  publisher = {PMLR},
  pdf =       {https://proceedings.mlr.press/v202/metelev23a/metelev23a.pdf},
  url =       {https://proceedings.mlr.press/v202/metelev23a.html},
  abstract =  {We consider decentralized optimization problems in which one aims to minimize a sum of convex smooth objective functions distributed over the nodes of a network. The links in the network can change over time. For the setting where the amount of change is arbitrary, lower complexity bounds and corresponding optimal algorithms are known, and consensus acceleration is not possible. In practice, however, the magnitude of network changes may be limited. We derive lower complexity bounds for several regimes of network change velocity. Moreover, we show how to obtain accelerated communication rates for a certain class of time-varying graphs using a specific consensus algorithm.}
}
Endnote
%0 Conference Paper
%T Is Consensus Acceleration Possible in Decentralized Optimization over Slowly Time-Varying Networks?
%A Dmitry Metelev
%A Alexander Rogozin
%A Dmitry Kovalev
%A Alexander Gasnikov
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-metelev23a
%I PMLR
%P 24532--24554
%U https://proceedings.mlr.press/v202/metelev23a.html
%V 202
%X We consider decentralized optimization problems in which one aims to minimize a sum of convex smooth objective functions distributed over the nodes of a network. The links in the network can change over time. For the setting where the amount of change is arbitrary, lower complexity bounds and corresponding optimal algorithms are known, and consensus acceleration is not possible. In practice, however, the magnitude of network changes may be limited. We derive lower complexity bounds for several regimes of network change velocity. Moreover, we show how to obtain accelerated communication rates for a certain class of time-varying graphs using a specific consensus algorithm.
APA
Metelev, D., Rogozin, A., Kovalev, D., & Gasnikov, A. (2023). Is Consensus Acceleration Possible in Decentralized Optimization over Slowly Time-Varying Networks? Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:24532-24554. Available from https://proceedings.mlr.press/v202/metelev23a.html.