Lower Complexity Bounds for Finite-Sum Convex-Concave Minimax Optimization Problems

Guangzeng Xie, Luo Luo, Yijiang Lian, Zhihua Zhang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10504-10513, 2020.

Abstract

This paper studies the lower complexity bound for minimax optimization problems whose objective function is the average of $n$ individual smooth convex-concave functions. We consider algorithms that have access to a gradient and a proximal oracle for each individual component. For the strongly-convex-strongly-concave case, we prove that such an algorithm cannot reach an $\varepsilon$-suboptimal point in fewer than $\Omega\left((n+\kappa)\log(1/\varepsilon)\right)$ iterations, where $\kappa$ is the condition number of the objective function. This lower bound matches the upper bound of the existing incremental first-order oracle algorithm, stochastic variance-reduced extragradient. We develop a novel construction to prove this result, which partitions the tridiagonal matrix of the classical examples into $n$ groups. This construction is amenable to the analysis of incremental gradient and proximal oracles, and we also extend the analysis to the general convex-concave case.
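Concretely, the setting described in the abstract can be written as the finite-sum convex-concave saddle-point problem below. This is a sketch of the formulation implied by the abstract; the dimension symbols $d_x$ and $d_y$ are illustrative notation not fixed by the text, and each component $f_i$ is smooth, convex in $x$, and concave in $y$:

$$\min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \; F(x, y) := \frac{1}{n} \sum_{i=1}^{n} f_i(x, y).$$

In this oracle model, each iteration may query the gradient or proximal oracle of a single component $f_i$, and the main theorem states that in the strongly-convex-strongly-concave case, $\Omega\left((n+\kappa)\log(1/\varepsilon)\right)$ such queries are necessary to reach an $\varepsilon$-suboptimal point.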

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-xie20d,
  title     = {Lower Complexity Bounds for Finite-Sum Convex-Concave Minimax Optimization Problems},
  author    = {Xie, Guangzeng and Luo, Luo and Lian, Yijiang and Zhang, Zhihua},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10504--10513},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/xie20d/xie20d.pdf},
  url       = {https://proceedings.mlr.press/v119/xie20d.html}
}
APA
Xie, G., Luo, L., Lian, Y., & Zhang, Z. (2020). Lower Complexity Bounds for Finite-Sum Convex-Concave Minimax Optimization Problems. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10504-10513. Available from https://proceedings.mlr.press/v119/xie20d.html.