An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

Lesi Chen, Haishan Ye, Luo Luo
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:1990-1998, 2024.

Abstract

This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding an $\epsilon$-stationary point. Concretely, it requires $\mathcal{O}(\min (\kappa^3\epsilon^{-3},\kappa^2 \sqrt{N} \epsilon^{-2} ))$ stochastic first-order oracle (SFO) calls and $\tilde{\mathcal{O}}(\kappa^2 \epsilon^{-2})$ communication rounds, where $\kappa$ is the condition number and $N$ is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
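For readers unfamiliar with the problem setting, the sketch below illustrates a generic decentralized stochastic gradient descent ascent loop for $\min_x \max_y \frac{1}{m}\sum_i f_i(x,y)$ over $m$ agents coupled by a doubly stochastic mixing matrix $W$. It is a minimal illustration under assumed interfaces (the `stoch_grads` callables and step sizes are hypothetical), not the DREAM algorithm itself, which additionally employs recursive variance-reduced gradient estimators to attain the complexities stated above.

```python
import numpy as np

def decentralized_sgda(stoch_grads, W, x0, y0, eta_x, eta_y, T):
    """Minimal sketch of decentralized stochastic GDA (not DREAM).

    stoch_grads[i](x, y) is assumed to return a stochastic gradient pair
    (g_x, g_y) of agent i's local objective f_i at (x, y).
    W is an m-by-m doubly stochastic mixing matrix of the network.
    """
    m = len(stoch_grads)
    X = np.tile(x0, (m, 1))  # each agent keeps a local copy of the min variable
    Y = np.tile(y0, (m, 1))  # and of the max variable
    for _ in range(T):
        Gx = np.empty_like(X)
        Gy = np.empty_like(Y)
        for i, grad in enumerate(stoch_grads):
            gx, gy = grad(X[i], Y[i])  # one SFO call per agent
            Gx[i], Gy[i] = gx, gy
        # One gossip (communication) round mixes neighbors' iterates;
        # each agent then descends in x and ascends in y.
        X = W @ (X - eta_x * Gx)
        Y = W @ (Y + eta_y * Gy)
    return X.mean(axis=0), Y.mean(axis=0)
```

In this toy loop each outer iteration costs one SFO call per agent and one communication round; DREAM improves on this baseline by reusing past stochastic gradients recursively, which is what drives the $\mathcal{O}(\min(\kappa^3\epsilon^{-3}, \kappa^2\sqrt{N}\epsilon^{-2}))$ SFO bound reported in the abstract.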

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-chen24b,
  title     = {An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization},
  author    = {Chen, Lesi and Ye, Haishan and Luo, Luo},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {1990--1998},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/chen24b/chen24b.pdf},
  url       = {https://proceedings.mlr.press/v238/chen24b.html},
  abstract  = {This paper studies the stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding the $\epsilon$-stationary points. Concretely, it requires $\mathcal{O}(\min (\kappa^3\epsilon^{-3},\kappa^2 \sqrt{N} \epsilon^{-2} ))$ stochastic first-order oracle (SFO) calls and $\tilde \mathcal O(\kappa^2 \epsilon^{-2})$ communication rounds, where $\kappa$ is the condition number and $N$ is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.}
}
Endnote
%0 Conference Paper
%T An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
%A Lesi Chen
%A Haishan Ye
%A Luo Luo
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-chen24b
%I PMLR
%P 1990--1998
%U https://proceedings.mlr.press/v238/chen24b.html
%V 238
%X This paper studies the stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding the $\epsilon$-stationary points. Concretely, it requires $\mathcal{O}(\min (\kappa^3\epsilon^{-3},\kappa^2 \sqrt{N} \epsilon^{-2} ))$ stochastic first-order oracle (SFO) calls and $\tilde \mathcal O(\kappa^2 \epsilon^{-2})$ communication rounds, where $\kappa$ is the condition number and $N$ is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
APA
Chen, L., Ye, H. & Luo, L. (2024). An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:1990-1998. Available from https://proceedings.mlr.press/v238/chen24b.html.
