Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability

Michael Crawshaw, Blake Woodworth, Mingrui Liu
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:11465-11492, 2025.

Abstract

Existing analysis of Local (Stochastic) Gradient Descent for heterogeneous objectives requires stepsizes $\eta \leq 1/K$ where $K$ is the communication interval, which ensures monotonic decrease of the objective. In contrast, we analyze Local Gradient Descent for logistic regression with separable, heterogeneous data using any stepsize $\eta > 0$. With $R$ communication rounds and $M$ clients, we show convergence at a rate $\mathcal{O}(1/\eta K R)$ after an initial unstable phase lasting for $\widetilde{\mathcal{O}}(\eta K M)$ rounds. This improves upon the existing $\mathcal{O}(1/R)$ rate for general smooth, convex objectives. Our analysis parallels the single machine analysis of Wu et al. (2024) in which instability is caused by extremely large stepsizes, but in our setting another source of instability is large local updates with heterogeneous objectives.
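To make the setting concrete: in Local GD, each of the $M$ clients takes $K$ full-batch gradient steps with the constant stepsize $\eta$ on its own logistic loss, the server averages the resulting local models, and this repeats for $R$ communication rounds. Below is a minimal, illustrative Python sketch of this standard procedure on client-partitioned data; the function and variable names (e.g. local_gd_logreg) are ours, not the paper's, and the sketch omits numerical safeguards such as an overflow-safe sigmoid.

import numpy as np

def local_gd_logreg(X_clients, y_clients, eta, K, R):
    # Local GD: R communication rounds, K local full-batch GD steps per round,
    # constant stepsize eta, M = len(X_clients) clients.
    # X_clients[m]: (n_m, d) array of features; y_clients[m]: labels in {-1, +1}.
    d = X_clients[0].shape[1]
    w = np.zeros(d)  # global model, broadcast at the start of every round
    for _ in range(R):
        local_models = []
        for X, y in zip(X_clients, y_clients):
            w_m = w.copy()
            for _ in range(K):
                margins = y * (X @ w_m)
                # gradient of the mean logistic loss log(1 + exp(-margin))
                grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / len(y)
                w_m -= eta * grad
            local_models.append(w_m)
        w = np.mean(local_models, axis=0)  # server averages the client models
    return w

Running such a sketch with separable, heterogeneous client data and a stepsize well above $1/K$ corresponds to the regime the paper analyzes, in which an initial unstable phase precedes the $\mathcal{O}(1/\eta K R)$ convergence.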

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-crawshaw25a,
  title     = {Constant Stepsize Local {GD} for Logistic Regression: Acceleration by Instability},
  author    = {Crawshaw, Michael and Woodworth, Blake and Liu, Mingrui},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {11465--11492},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/crawshaw25a/crawshaw25a.pdf},
  url       = {https://proceedings.mlr.press/v267/crawshaw25a.html},
  abstract  = {Existing analysis of Local (Stochastic) Gradient Descent for heterogeneous objectives requires stepsizes $\eta \leq 1/K$ where $K$ is the communication interval, which ensures monotonic decrease of the objective. In contrast, we analyze Local Gradient Descent for logistic regression with separable, heterogeneous data using any stepsize $\eta > 0$. With $R$ communication rounds and $M$ clients, we show convergence at a rate $\mathcal{O}(1/\eta K R)$ after an initial unstable phase lasting for $\widetilde{\mathcal{O}}(\eta K M)$ rounds. This improves upon the existing $\mathcal{O}(1/R)$ rate for general smooth, convex objectives. Our analysis parallels the single machine analysis of Wu et al. (2024) in which instability is caused by extremely large stepsizes, but in our setting another source of instability is large local updates with heterogeneous objectives.}
}
APA
Crawshaw, M., Woodworth, B. & Liu, M. (2025). Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:11465-11492. Available from https://proceedings.mlr.press/v267/crawshaw25a.html.
