Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning Over Networks
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1529-1538, 2017.
Abstract
In this paper we consider nonconvex optimization and learning over a network of distributed nodes. We develop a Proximal Primal-Dual Algorithm (Prox-PDA), which enables the network nodes to distributedly and collectively compute the set of first-order stationary solutions in a global sublinear manner (with a rate of $O(1/r)$, where $r$ is the iteration counter). To the best of our knowledge, this is the first algorithm that enables distributed nonconvex optimization with global rate guarantees. Our numerical experiments also demonstrate the effectiveness of the proposed algorithm.
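To make the setting concrete, the following is a minimal sketch of a primal-dual iteration in the spirit of Prox-PDA for distributed consensus optimization, $\min_x \sum_i f_i(x)$ over a connected graph. The local costs $f_i$, the 3-node path graph, the gradient-linearized primal step, and the penalty choices ($\beta$, and $B^T B = \kappa I$) are all illustrative assumptions made for this sketch, not the paper's exact algorithm or experimental setup.

```python
import numpy as np

def prox_pda_sketch(a, edges, beta=2.0, kappa=4.0, iters=2000):
    """Illustrative sketch (not the paper's exact method): each node i
    holds a scalar copy x_i and a nonconvex local cost
    f_i(x) = log(1 + (x - a_i)^2).  A is the edge-node incidence matrix,
    so A x = 0 encodes the consensus constraint x_1 = ... = x_n."""
    n = len(a)
    A = np.zeros((len(edges), n))
    for k, (i, j) in enumerate(edges):
        A[k, i], A[k, j] = 1.0, -1.0
    x = np.zeros(n)                  # primal variables (one copy per node)
    lam = np.zeros(len(edges))       # dual variables (one per edge)
    # Assumed proximal term B^T B = kappa * I; the primal subproblem is
    # then a strongly convex quadratic solved in closed form below.
    M = beta * (A.T @ A + kappa * np.eye(n))
    for _ in range(iters):
        grad = 2 * (x - a) / (1 + (x - a) ** 2)   # local gradients of f_i
        # Gradient-linearized primal step: minimize the linearized cost
        # plus augmented Lagrangian and proximal penalties.
        x = np.linalg.solve(M, beta * kappa * x - grad - A.T @ lam)
        # Dual ascent on the consensus constraint A x = 0.
        lam = lam + beta * (A @ x)
    return x

# Toy run on a 3-node path graph with targets a_i = (-1, 0, 3).
x = prox_pda_sketch(np.array([-1.0, 0.0, 3.0]), [(0, 1), (1, 2)])
```

At a fixed point the dual update forces $Ax = 0$ (consensus), and summing the primal stationarity conditions over the nodes cancels the $A^T \lambda$ terms, leaving $\sum_i \nabla f_i(\bar{x}) = 0$, i.e. a first-order stationary point of the original problem.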