Semi-Cyclic Stochastic Gradient Descent


Hubert Eichner, Tomer Koren, Brendan McMahan, Nathan Srebro, Kunal Talwar;
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1764-1773, 2019.

Abstract

We consider convex SGD updates with a block-cyclic structure, i.e., where each cycle consists of a small number of blocks, each with many samples from a possibly different, block-specific distribution. This situation arises, e.g., in Federated Learning, where the mobile devices available for updates at different times during the day have different characteristics. We show that such a block-cyclic structure can significantly deteriorate the performance of SGD, but propose a simple approach that allows prediction with the same guarantees as for i.i.d., non-cyclic sampling.
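The block-cyclic setting can be made concrete with a small simulation. The sketch below (not the authors' code) runs SGD on a least-squares objective where two blocks with different data distributions arrive in a fixed cyclic order, and keeps a running average of the iterates seen within each block as a block-specific predictor. The objective, step size, and per-block averaging rule are illustrative assumptions about one way to realize a pluralistic, per-block solution, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 5         # parameter dimension
K = 2         # blocks per cycle (e.g., "daytime" vs. "nighttime" devices)
cycles = 200  # number of cycles
n_block = 50  # samples drawn within each block

# Each block draws labels from its own ground-truth vector (assumed model),
# so the block-specific distributions genuinely disagree.
w_star = rng.normal(size=(K, d))

def sample(block, n):
    """Draw (x, y) pairs from the block-specific distribution."""
    X = rng.normal(size=(n, d))
    y = X @ w_star[block] + 0.1 * rng.normal(size=n)
    return X, y

w = np.zeros(d)              # single SGD iterate, updated across all blocks
w_avg = np.zeros((K, d))     # running average of iterates seen in each block
counts = np.zeros(K)

t = 0
for _ in range(cycles):
    for k in range(K):       # blocks arrive in a fixed cyclic order
        X, y = sample(k, n_block)
        for x_i, y_i in zip(X, y):
            t += 1
            lr = 1.0 / np.sqrt(t)          # standard 1/sqrt(t) step size
            grad = (w @ x_i - y_i) * x_i   # squared-loss gradient
            w -= lr * grad
            counts[k] += 1
            w_avg[k] += (w - w_avg[k]) / counts[k]

# A single model is pulled between the block distributions; the per-block
# averages track each block's own optimum much more closely.
for k in range(K):
    print(f"block {k}: ||w_avg[k] - w*_k|| = "
          f"{np.linalg.norm(w_avg[k] - w_star[k]):.3f}")
```

Running the sketch shows both per-block distances shrinking with more cycles, whereas a single averaged iterate settles between the two block optima; this is the deterioration the abstract refers to, and the per-block predictors are one way to avoid it.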
