Tighter Theory for Local SGD on Identical and Heterogeneous Data


Ahmed Khaled, Konstantin Mishchenko, Peter Richtarik;
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4519-4529, 2020.

Abstract

We provide a new analysis of local SGD, removing unnecessary assumptions and elaborating on the difference between two data regimes: identical and heterogeneous. In both cases, we improve the existing theory and provide values of the optimal stepsize and optimal number of local iterations. Our bounds are based on a new notion of variance that is specific to local SGD methods with different data. The tightness of our results is guaranteed by recovering known statements when we plug in $H=1$, where $H$ is the number of local steps. Empirical evidence further validates the severe impact of data heterogeneity on the performance of local SGD.
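To make the setting concrete, the following is a minimal sketch of local SGD as the abstract describes it: each of $M$ workers takes $H$ local stochastic gradient steps, after which the local iterates are averaged (a communication round). The quadratic worker objectives, noise level, and all parameter values below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def local_sgd(grads, x0, stepsize, H, rounds, rng):
    """Local SGD sketch: H local steps per worker, then average the iterates.

    grads: list of per-worker gradient functions (one per data shard).
    H:     number of local steps between communication rounds.
    """
    x = np.array(x0, dtype=float)
    for _ in range(rounds):
        local_iterates = []
        for grad in grads:
            y = x.copy()
            for _ in range(H):
                # Stochastic gradient: true gradient plus small Gaussian noise.
                g = grad(y) + 0.01 * rng.standard_normal(y.shape)
                y -= stepsize * g
            local_iterates.append(y)
        # Communication round: average the workers' local models.
        x = np.mean(local_iterates, axis=0)
    return x

# Heterogeneous-data toy problem: worker m minimizes (x - b_m)^2 / 2
# with different b_m, so the local optima disagree; the average
# objective is minimized at mean(b_m).
rng = np.random.default_rng(0)
bs = [np.array([1.0]), np.array([-1.0]), np.array([3.0])]
grads = [lambda x, b=b: x - b for b in bs]
x_star = np.mean(bs, axis=0)  # minimizer of the average objective
x = local_sgd(grads, np.zeros(1), stepsize=0.1, H=5, rounds=200, rng=rng)
```

Running the same sketch with identical `b_m` on every worker (the identical-data regime) removes the disagreement between local optima, which is exactly the distinction the analysis formalizes.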
