Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning Over Networks

Mingyi Hong, Davood Hajinezhad, Ming-Min Zhao
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1529-1538, 2017.

Abstract

In this paper we consider nonconvex optimization and learning over a network of distributed nodes. We develop a Proximal Primal-Dual Algorithm (Prox-PDA), which enables the network nodes to distributedly and collectively converge to the set of first-order stationary solutions at a global sublinear rate of $O(1/r)$, where $r$ is the iteration counter. To the best of our knowledge, this is the first algorithm for distributed nonconvex optimization with a global rate guarantee. Our numerical experiments also demonstrate the effectiveness of the proposed algorithm.
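For intuition, the paper encodes consensus over the graph as a linear constraint $Ax = 0$, where $A$ is the signed incidence matrix, and alternates an augmented-Lagrangian primal step (with a proximal term built from the signless incidence matrix $B$) with dual ascent $\mu^{r+1} = \mu^r + \beta A x^{r+1}$. The sketch below implements the gradient (linearized) variant of this scheme on a toy scalar problem; the ring graph, the local objectives $f_i$, and the penalty $\beta$ are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a gradient (linearized) Prox-PDA update on a toy
# problem; the ring graph, local objectives, and beta are assumptions.
rng = np.random.default_rng(0)
n = 5                                             # number of nodes
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]  # ring graph (assumed)
a = rng.uniform(0.5, 1.5, n)
b = rng.uniform(-1.0, 1.0, n)

def grad(x):
    # Stacked local gradients of f_i(x_i) = 0.5*a_i*x_i^2 + b_i*sin(x_i)
    return a * x + b * np.cos(x)

# Signed incidence matrix A: consensus is encoded as A x = 0
A = np.zeros((len(edges), n))
for k, (i, j) in enumerate(edges):
    A[k, i], A[k, j] = 1.0, -1.0
B = np.abs(A)                              # signless incidence matrix
Lminus, Lplus = A.T @ A, B.T @ B           # signed / signless Laplacians
deg2 = np.diag(Lminus + Lplus)             # A^T A + B^T B = 2 * degrees

beta = 5.0                     # penalty; chosen large relative to the
                               # gradient Lipschitz constant of the f_i
x = rng.standard_normal(n)     # one local copy of the variable per node
mu = np.zeros(len(edges))      # one dual variable per edge

for r in range(1000):
    # Primal step: minimize the linearized augmented Lagrangian plus the
    # proximal term (beta/2)*||x - x_r||^2_{B^T B}; closed form because
    # A^T A + B^T B is diagonal.
    rhs = beta * (Lplus @ x) - grad(x) - A.T @ mu
    x = rhs / (beta * deg2)
    # Dual ascent on the consensus constraint A x = 0
    mu = mu + beta * (A @ x)

print("consensus gap :", np.linalg.norm(A @ x))
print("stationarity  :", abs(grad(x).sum()))
```

Note that because $A^{\mathsf T}A + B^{\mathsf T}B$ equals twice the (diagonal) degree matrix, the primal step has a closed form in which each node only combines its own gradient with values held by its neighbors, so the iteration runs fully distributedly with per-edge dual variables.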

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-hong17a,
  title     = {Prox-{PDA}: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning Over Networks},
  author    = {Mingyi Hong and Davood Hajinezhad and Ming-Min Zhao},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {1529--1538},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/hong17a/hong17a.pdf},
  url       = {https://proceedings.mlr.press/v70/hong17a.html}
}
APA
Hong, M., Hajinezhad, D. & Zhao, M. (2017). Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning Over Networks. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1529-1538. Available from https://proceedings.mlr.press/v70/hong17a.html.