Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization

Chris Junchi Li, Huizhuo Yuan, Gauthier Gidel, Quanquan Gu, Michael Jordan
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:20351-20383, 2023.

Abstract

We propose a new first-order optimization algorithm, Accelerated Gradient-Optimistic Gradient (AG-OG) Descent Ascent, for separable convex-concave minimax optimization. The main idea of our algorithm is to carefully leverage the structure of the minimax problem, performing Nesterov acceleration on the individual component and optimistic gradient on the coupling component. Equipped with proper restarting, we show that AG-OG achieves the optimal convergence rate (up to a constant) for a variety of settings, including bilinearly coupled strongly convex-strongly concave minimax optimization (bi-SC-SC), bilinearly coupled convex-strongly concave minimax optimization (bi-C-SC), and bilinear games. We also extend our algorithm to the stochastic setting and achieve the optimal convergence rate in both bi-SC-SC and bi-C-SC settings. AG-OG is the first single-call algorithm with optimal convergence rates in both deterministic and stochastic settings for bilinearly coupled minimax optimization problems.
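To make the abstract's idea concrete, below is a minimal NumPy sketch of one AG-OG-style update for the separable problem min_x max_y f(x) + x^T B y - g(y): a Nesterov-style extrapolation on the individual gradients grad_f and grad_g, combined with an optimistic (past-gradient) correction on the bilinear coupling B so that only one fresh coupling gradient is needed per step. The function name ag_og_step, the single step size eta, and the momentum beta are illustrative assumptions; the paper's actual algorithm uses carefully tuned parameter sequences and a restarting schedule not reproduced here.

import numpy as np

def ag_og_step(x, y, x_prev, y_prev, grad_f, grad_g, B, eta, beta):
    """One schematic AG-OG-style update for
        min_x max_y  f(x) + x^T B y - g(y).
    A simplified sketch: step sizes are illustrative, not the paper's choices.
    """
    # Nesterov-style extrapolation on the individual (separable) components.
    x_md = x + beta * (x - x_prev)
    y_md = y + beta * (y - y_prev)
    # Optimistic gradient on the coupling: use 2*(current) - (previous)
    # coupling gradient, a single-call past-gradient correction.
    gx = grad_f(x_md) + (2.0 * (B @ y) - B @ y_prev)
    gy = grad_g(y_md) - (2.0 * (B.T @ x) - B.T @ x_prev)
    # Descent on x; ascent on y is folded into the sign of gy.
    return x_md - eta * gx, y_md - eta * gy

# Example: a small strongly convex-strongly concave quadratic instance,
# f(x) = 0.5||x||^2 and g(y) = 0.5||y||^2, with saddle point at (0, 0).
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
grad_f = lambda x: x
grad_g = lambda y: y
x, y = np.ones(3), np.ones(3)
x_prev, y_prev = x.copy(), y.copy()
for _ in range(2000):
    x_new, y_new = ag_og_step(x, y, x_prev, y_prev, grad_f, grad_g, B,
                              eta=0.05, beta=0.3)
    x_prev, y_prev, x, y = x, y, x_new, y_new
# For suitable (untuned, problem-dependent) eta and beta, the iterates
# should approach the saddle point, so both norms shrink toward 0.
print(np.linalg.norm(x), np.linalg.norm(y))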

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-li23aq,
  title     = {{N}esterov Meets Optimism: Rate-Optimal Separable Minimax Optimization},
  author    = {Li, Chris Junchi and Yuan, Huizhuo and Gidel, Gauthier and Gu, Quanquan and Jordan, Michael},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {20351--20383},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/li23aq/li23aq.pdf},
  url       = {https://proceedings.mlr.press/v202/li23aq.html}
}
Endnote
%0 Conference Paper
%T Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization
%A Chris Junchi Li
%A Huizhuo Yuan
%A Gauthier Gidel
%A Quanquan Gu
%A Michael Jordan
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-li23aq
%I PMLR
%P 20351--20383
%U https://proceedings.mlr.press/v202/li23aq.html
%V 202
APA
Li, C.J., Yuan, H., Gidel, G., Gu, Q. & Jordan, M. (2023). Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:20351-20383. Available from https://proceedings.mlr.press/v202/li23aq.html.
