Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization

Yuki Takezawa, Xiaowen Jiang, Anton Rodomanov, Sebastian U Stich
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:58359-58405, 2025.

Abstract

Reducing communication complexity is critical for efficient decentralized optimization. The proximal decentralized optimization (PDO) framework is particularly appealing, as methods within this framework can exploit functional similarity among nodes to reduce communication rounds. Specifically, when local functions at different nodes are similar, these methods achieve faster convergence with fewer communication steps. However, existing PDO methods often require highly accurate solutions to subproblems associated with the proximal operator, resulting in significant computational overhead. In this work, we propose the Stabilized Proximal Decentralized Optimization (SPDO) method, which achieves state-of-the-art communication and computational complexities within the PDO framework. Additionally, we refine the analysis of existing PDO methods by relaxing subproblem accuracy requirements and leveraging average functional similarity. Experimental results demonstrate that SPDO significantly outperforms existing methods.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-takezawa25a,
  title     = {Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization},
  author    = {Takezawa, Yuki and Jiang, Xiaowen and Rodomanov, Anton and Stich, Sebastian U},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {58359--58405},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/takezawa25a/takezawa25a.pdf},
  url       = {https://proceedings.mlr.press/v267/takezawa25a.html},
  abstract  = {Reducing communication complexity is critical for efficient decentralized optimization. The proximal decentralized optimization (PDO) framework is particularly appealing, as methods within this framework can exploit functional similarity among nodes to reduce communication rounds. Specifically, when local functions at different nodes are similar, these methods achieve faster convergence with fewer communication steps. However, existing PDO methods often require highly accurate solutions to subproblems associated with the proximal operator, resulting in significant computational overhead. In this work, we propose the Stabilized Proximal Decentralized Optimization (SPDO) method, which achieves state-of-the-art communication and computational complexities within the PDO framework. Additionally, we refine the analysis of existing PDO methods by relaxing subproblem accuracy requirements and leveraging average functional similarity. Experimental results demonstrate that SPDO significantly outperforms existing methods.}
}
Endnote
%0 Conference Paper
%T Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization
%A Yuki Takezawa
%A Xiaowen Jiang
%A Anton Rodomanov
%A Sebastian U Stich
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-takezawa25a
%I PMLR
%P 58359--58405
%U https://proceedings.mlr.press/v267/takezawa25a.html
%V 267
%X Reducing communication complexity is critical for efficient decentralized optimization. The proximal decentralized optimization (PDO) framework is particularly appealing, as methods within this framework can exploit functional similarity among nodes to reduce communication rounds. Specifically, when local functions at different nodes are similar, these methods achieve faster convergence with fewer communication steps. However, existing PDO methods often require highly accurate solutions to subproblems associated with the proximal operator, resulting in significant computational overhead. In this work, we propose the Stabilized Proximal Decentralized Optimization (SPDO) method, which achieves state-of-the-art communication and computational complexities within the PDO framework. Additionally, we refine the analysis of existing PDO methods by relaxing subproblem accuracy requirements and leveraging average functional similarity. Experimental results demonstrate that SPDO significantly outperforms existing methods.
APA
Takezawa, Y., Jiang, X., Rodomanov, A. & Stich, S.U. (2025). Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:58359-58405. Available from https://proceedings.mlr.press/v267/takezawa25a.html.