A Doubly Recursive Stochastic Compositional Gradient Descent Method for Federated Multi-Level Compositional Optimization

Hongchang Gao
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:14540-14610, 2024.

Abstract

Federated compositional optimization has been actively studied in the past few years. However, existing methods mainly focus on the two-level compositional optimization problem and cannot be directly applied to its multi-level counterpart. Moreover, the convergence rate of existing federated two-level compositional optimization algorithms fails to achieve linear speedup with respect to the number of workers under heterogeneous settings. After identifying the reason for this failure, we develop a novel federated stochastic multi-level compositional optimization algorithm by introducing a novel Jacobian-vector product estimator. This innovation mitigates both the heterogeneity issue and the communication-efficiency issue simultaneously. We then theoretically prove that our algorithm achieves a level-independent, linear-speedup convergence rate for nonconvex problems. To our knowledge, this is the first federated learning algorithm to achieve such a favorable convergence rate for multi-level compositional problems. Moreover, experimental results confirm the efficacy of our algorithm.
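For context, here is a minimal sketch of the problem class the abstract refers to, in notation assumed for this summary (the paper's own symbols and exact formulation may differ). In federated multi-level compositional optimization, N workers jointly minimize a K-level composition of expectations over local data:

\[
\min_{x \in \mathbb{R}^d} \; F(x) \triangleq \frac{1}{N} \sum_{n=1}^{N} F^{(n)}(x),
\qquad
F^{(n)}(x) = f_K^{(n)}\big(f_{K-1}^{(n)}\big(\cdots f_1^{(n)}(x)\cdots\big)\big),
\]

where each level map \(f_k^{(n)}(\cdot) = \mathbb{E}_{\xi}\big[f_k^{(n)}(\cdot\,;\xi)\big]\) is available to worker n only through stochastic samples. By the chain rule, the local gradient is a product of K Jacobians:

\[
\nabla F^{(n)}(x) = \nabla f_1^{(n)}(y_0)\, \nabla f_2^{(n)}(y_1) \cdots \nabla f_K^{(n)}(y_{K-1}),
\qquad
y_0 = x,\;\; y_k = f_k^{(n)}(y_{k-1}).
\]

A naive estimator multiplies K independent stochastic Jacobians, so its variance, and hence the convergence rate, degrades with the number of levels; estimating the Jacobian-vector products directly, as the abstract describes, is what removes this level dependence. The "doubly recursive" in the title suggests a variance-reduced recursion of the STORM type, \(u_t = g_t(x_t) + (1-\beta)\big(u_{t-1} - g_t(x_{t-1})\big)\), applied both to the inner function values and to the Jacobian-vector products, though the precise updates are given in the paper.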

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-gao24a,
  title     = {A Doubly Recursive Stochastic Compositional Gradient Descent Method for Federated Multi-Level Compositional Optimization},
  author    = {Gao, Hongchang},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {14540--14610},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/gao24a/gao24a.pdf},
  url       = {https://proceedings.mlr.press/v235/gao24a.html}
}
Endnote
%0 Conference Paper
%T A Doubly Recursive Stochastic Compositional Gradient Descent Method for Federated Multi-Level Compositional Optimization
%A Hongchang Gao
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-gao24a
%I PMLR
%P 14540--14610
%U https://proceedings.mlr.press/v235/gao24a.html
%V 235
APA
Gao, H. (2024). A Doubly Recursive Stochastic Compositional Gradient Descent Method for Federated Multi-Level Compositional Optimization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:14540-14610. Available from https://proceedings.mlr.press/v235/gao24a.html.
