The Min-Max Complexity of Distributed Stochastic Convex Optimization with Intermittent Communication

Blake E Woodworth, Brian Bullins, Ohad Shamir, Nathan Srebro
Proceedings of Thirty Fourth Conference on Learning Theory, PMLR 134:4386-4437, 2021.

Abstract

We resolve the min-max complexity of distributed stochastic convex optimization (up to a log factor) in the intermittent communication setting, where $M$ machines work in parallel over the course of $R$ rounds of communication to optimize the objective, and during each round of communication, each machine may sequentially compute $K$ stochastic gradient estimates. We present a novel lower bound with a matching upper bound that establishes an optimal algorithm.
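For intuition, below is a minimal Python sketch of the intermittent communication setting described in the abstract: M machines each compute K sequential stochastic gradient estimates between each of R communication rounds. The particular scheme shown (Local SGD with iterate averaging), the quadratic objective, the noise level, and the step size are illustrative assumptions for the demo, not the paper's optimal algorithm.

import numpy as np

def stochastic_grad(x, noise=0.1, rng=None):
    # Gradient of the toy objective f(x) = 0.5 * ||x||^2, plus Gaussian
    # noise, standing in for the stochastic gradient oracle each machine queries.
    rng = rng or np.random.default_rng()
    return x + noise * rng.standard_normal(x.shape)

def local_sgd(M=4, R=10, K=20, lr=0.05, dim=5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)        # shared initial iterate
    for _ in range(R):                  # R rounds of communication
        local = np.tile(x, (M, 1))      # each machine starts the round at the consensus point
        for m in range(M):              # M machines working in parallel (simulated sequentially)
            for _ in range(K):          # K sequential stochastic gradient estimates per round
                local[m] -= lr * stochastic_grad(local[m], rng=rng)
        x = local.mean(axis=0)          # communication step: average the local iterates
    return x

x_final = local_sgd()
print("final objective value:", 0.5 * float(np.sum(x_final ** 2)))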

Cite this Paper


BibTeX
@InProceedings{pmlr-v134-woodworth21a,
  title     = {The Min-Max Complexity of Distributed Stochastic Convex Optimization with Intermittent Communication},
  author    = {Woodworth, Blake E and Bullins, Brian and Shamir, Ohad and Srebro, Nathan},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  pages     = {4386--4437},
  year      = {2021},
  editor    = {Belkin, Mikhail and Kpotufe, Samory},
  volume    = {134},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v134/woodworth21a/woodworth21a.pdf},
  url       = {https://proceedings.mlr.press/v134/woodworth21a.html},
  abstract  = {We resolve the min-max complexity of distributed stochastic convex optimization (up to a log factor) in the intermittent communication setting, where $M$ machines work in parallel over the course of $R$ rounds of communication to optimize the objective, and during each round of communication, each machine may sequentially compute $K$ stochastic gradient estimates. We present a novel lower bound with a matching upper bound that establishes an optimal algorithm.}
}
Endnote
%0 Conference Paper
%T The Min-Max Complexity of Distributed Stochastic Convex Optimization with Intermittent Communication
%A Blake E Woodworth
%A Brian Bullins
%A Ohad Shamir
%A Nathan Srebro
%B Proceedings of Thirty Fourth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Mikhail Belkin
%E Samory Kpotufe
%F pmlr-v134-woodworth21a
%I PMLR
%P 4386--4437
%U https://proceedings.mlr.press/v134/woodworth21a.html
%V 134
%X We resolve the min-max complexity of distributed stochastic convex optimization (up to a log factor) in the intermittent communication setting, where $M$ machines work in parallel over the course of $R$ rounds of communication to optimize the objective, and during each round of communication, each machine may sequentially compute $K$ stochastic gradient estimates. We present a novel lower bound with a matching upper bound that establishes an optimal algorithm.
APA
Woodworth, B.E., Bullins, B., Shamir, O. & Srebro, N. (2021). The Min-Max Complexity of Distributed Stochastic Convex Optimization with Intermittent Communication. Proceedings of Thirty Fourth Conference on Learning Theory, in Proceedings of Machine Learning Research 134:4386-4437. Available from https://proceedings.mlr.press/v134/woodworth21a.html.