Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization

Wei Shen, Minhui Huang, Jiawei Zhang, Cong Shen
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3988-3996, 2024.

Abstract

In recent years, federated minimax optimization has attracted growing interest due to its extensive applications in various machine learning tasks. While Smoothed Alternating Gradient Descent Ascent (Smoothed-AGDA) has proved successful in centralized nonconvex minimax optimization, whether and how smoothing techniques could be helpful in a federated setting remains unexplored. In this paper, we propose a new algorithm, termed Federated Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), which utilizes the smoothing technique for federated minimax optimization. We prove that FESS-GDA can be uniformly applied to solve several classes of federated minimax problems and establish new or better convergence results for these settings. We showcase the practical efficiency of FESS-GDA in federated learning tasks of training generative adversarial networks (GANs) and fair classification.
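To make the smoothing technique concrete: Smoothed-AGDA replaces the minimax objective f(x, y) with a regularized surrogate K(x, y; z) = f(x, y) + (p/2)||x - z||^2, runs alternating descent/ascent steps on K, and slowly moves the anchor point z toward x. The sketch below is a minimal centralized illustration of that idea, not the paper's FESS-GDA algorithm; the penalty parameter p, the averaging weight beta, and the step sizes are illustrative assumptions, not values from the paper.

    import numpy as np

    def smoothed_agda(grad_x, grad_y, x0, y0, p=1.0, beta=0.1,
                      lr_x=0.01, lr_y=0.05, iters=2000):
        """Minimal sketch of smoothed alternating GDA for min_x max_y f(x, y).

        Uses the surrogate K(x, y; z) = f(x, y) + (p/2) * ||x - z||^2.
        All hyperparameters here are illustrative, not taken from the paper.
        """
        x, y, z = x0.copy(), y0.copy(), x0.copy()
        for _ in range(iters):
            # Descent step on x against the smoothed surrogate K.
            x = x - lr_x * (grad_x(x, y) + p * (x - z))
            # Ascent step on y (the quadratic penalty does not involve y).
            y = y + lr_y * grad_y(x, y)
            # Move the anchor z a fraction of the way toward x.
            z = z + beta * (x - z)
        return x, y

    # Toy saddle problem f(x, y) = x*y - 0.5*y**2, saddle point at (0, 0).
    x, y = smoothed_agda(lambda x, y: y, lambda x, y: x - y,
                         np.array([1.0]), np.array([1.0]))

In the federated setting, one would expect each client to run such updates locally on stochastic gradients, with periodic server-side averaging; the actual FESS-GDA scheme and its parameter choices are given in the full paper.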

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-shen24c,
  title     = {Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization},
  author    = {Shen, Wei and Huang, Minhui and Zhang, Jiawei and Shen, Cong},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3988--3996},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/shen24c/shen24c.pdf},
  url       = {https://proceedings.mlr.press/v238/shen24c.html},
  abstract  = {In recent years, federated minimax optimization has attracted growing interest due to its extensive applications in various machine learning tasks. While Smoothed Alternating Gradient Descent Ascent (Smoothed-AGDA) has proved successful in centralized nonconvex minimax optimization, whether and how smoothing techniques could be helpful in a federated setting remains unexplored. In this paper, we propose a new algorithm, termed Federated Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), which utilizes the smoothing technique for federated minimax optimization. We prove that FESS-GDA can be uniformly applied to solve several classes of federated minimax problems and establish new or better convergence results for these settings. We showcase the practical efficiency of FESS-GDA in federated learning tasks of training generative adversarial networks (GANs) and fair classification.}
}
Endnote
%0 Conference Paper
%T Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization
%A Wei Shen
%A Minhui Huang
%A Jiawei Zhang
%A Cong Shen
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-shen24c
%I PMLR
%P 3988--3996
%U https://proceedings.mlr.press/v238/shen24c.html
%V 238
%X In recent years, federated minimax optimization has attracted growing interest due to its extensive applications in various machine learning tasks. While Smoothed Alternating Gradient Descent Ascent (Smoothed-AGDA) has proved successful in centralized nonconvex minimax optimization, whether and how smoothing techniques could be helpful in a federated setting remains unexplored. In this paper, we propose a new algorithm, termed Federated Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), which utilizes the smoothing technique for federated minimax optimization. We prove that FESS-GDA can be uniformly applied to solve several classes of federated minimax problems and establish new or better convergence results for these settings. We showcase the practical efficiency of FESS-GDA in federated learning tasks of training generative adversarial networks (GANs) and fair classification.
APA
Shen, W., Huang, M., Zhang, J. & Shen, C. (2024). Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3988-3996. Available from https://proceedings.mlr.press/v238/shen24c.html.

Related Material