FedNest: Federated Bilevel, Minimax, and Compositional Optimization

Davoud Ataee Tarzanagh, Mingchen Li, Christos Thrampoulidis, Samet Oymak
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:21146-21179, 2022.

Abstract

Standard federated optimization methods successfully apply to stochastic problems with single-level structure. However, many contemporary ML problems, including adversarial robustness, hyperparameter tuning, and actor-critic methods, fall under nested bilevel programming, which subsumes minimax and compositional optimization. In this work, we propose FedNest, a federated alternating stochastic gradient method for general nested problems. We establish provable convergence rates for FedNest in the presence of heterogeneous data and introduce variants for bilevel, minimax, and compositional optimization. FedNest introduces several innovations, including federated hypergradient computation and variance reduction to address inner-level heterogeneity. We complement our theory with experiments on hyperparameter and hyper-representation learning and on minimax optimization that demonstrate the benefits of our method in practice.
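The abstract compresses several moving parts, so a concrete sketch may help. In the generic bilevel template that subsumes minimax and compositional problems, an outer variable x is optimized through the solution y*(x) of an inner problem: minimize f(x, y*(x)) over x, subject to y*(x) = argmin over y of g(x, y). The toy script below illustrates the alternating structure the abstract describes, with FedAvg-style local SGD rounds on the inner variable followed by an averaged hypergradient step on the outer variable. It is a minimal sketch under assumed quadratic client objectives, client count, and step sizes, not the authors' FedNest implementation; on this quadratic problem the inner Hessian is the identity, which trivializes the inverse-Hessian-vector product that a general federated hypergradient must approximate.

import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous federated bilevel problem (objectives, sizes, and step
# sizes are illustrative assumptions, not the paper's experimental setup):
#   inner:  y*(x) = argmin_y (1/m) sum_i 0.5 * ||y - A_i x||^2
#   outer:  min_x (1/m) sum_i 0.5 * ||y*(x) - b_i||^2
m, d = 5, 3                                   # clients, dimension
A = [rng.standard_normal((d, d)) for _ in range(m)]
b = [rng.standard_normal(d) for _ in range(m)]

def inner_grad(i, x, y):                      # grad_y g_i(x, y)
    return y - A[i] @ x

def outer_grad_y(i, x, y):                    # grad_y f_i(x, y)
    return y - b[i]

def hypergrad(i, x, y, v):
    # grad_x f_i minus (mixed Hessian of g_i) times v; here grad_x f_i = 0
    # and the mixed Hessian grad_xy g_i equals -A_i^T, giving A_i^T v.
    return A[i].T @ v

x = rng.standard_normal(d)
y = np.zeros(d)
lr_in, lr_out = 0.5, 0.2

for _ in range(200):                          # outer rounds
    for _ in range(3):                        # inner FedAvg-style rounds
        client_ys = []
        for i in range(m):
            yi = y.copy()
            for _ in range(2):                # local SGD steps on y
                yi -= lr_in * inner_grad(i, x, yi)
            client_ys.append(yi)
        y = np.mean(client_ys, axis=0)        # server averages inner iterates

    # Federated hypergradient step on x. For this quadratic, grad_yy g is
    # the identity, so the inverse-Hessian-vector product reduces to the
    # averaged outer gradient; in general it must be approximated, e.g.
    # with a truncated Neumann series.
    v = np.mean([outer_grad_y(i, x, y) for i in range(m)], axis=0)
    hg = np.mean([hypergrad(i, x, y, v) for i in range(m)], axis=0)
    x -= lr_out * hg

y_star = np.mean(A, axis=0) @ x               # exact inner solution, for checking
print("outer objective:", 0.5 * np.mean([np.sum((y_star - bi) ** 2) for bi in b]))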

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-tarzanagh22a,
  title     = {{F}ed{N}est: Federated Bilevel, Minimax, and Compositional Optimization},
  author    = {Tarzanagh, Davoud Ataee and Li, Mingchen and Thrampoulidis, Christos and Oymak, Samet},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {21146--21179},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/tarzanagh22a/tarzanagh22a.pdf},
  url       = {https://proceedings.mlr.press/v162/tarzanagh22a.html}
}
Endnote
%0 Conference Paper
%T FedNest: Federated Bilevel, Minimax, and Compositional Optimization
%A Davoud Ataee Tarzanagh
%A Mingchen Li
%A Christos Thrampoulidis
%A Samet Oymak
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-tarzanagh22a
%I PMLR
%P 21146--21179
%U https://proceedings.mlr.press/v162/tarzanagh22a.html
%V 162
APA
Tarzanagh, D. A., Li, M., Thrampoulidis, C., & Oymak, S. (2022). FedNest: Federated Bilevel, Minimax, and Compositional Optimization. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:21146-21179. Available from https://proceedings.mlr.press/v162/tarzanagh22a.html.
