A Fast Optimistic Method for Monotone Variational Inequalities

Michael Sedlmayer, Dang-Khoa Nguyen, Radu Ioan Bot
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:30406-30438, 2023.

Abstract

We study monotone variational inequalities that can arise as optimality conditions for constrained convex optimization or convex-concave minimax problems, and propose a novel algorithm that uses only one gradient/operator evaluation and one projection onto the constraint set per iteration. The algorithm, which we call fOGDA-VI, achieves an $o(\frac{1}{k})$ rate of convergence in terms of the restricted gap function as well as the natural residual for the last iterate. Moreover, we provide a convergence guarantee for the sequence of iterates to a solution of the variational inequality. These are the best theoretical convergence results for numerical methods for (only) monotone variational inequalities reported in the literature. To empirically validate our algorithm, we investigate a two-player matrix game with mixed strategies. Concluding, we show promising results regarding the application of fOGDA-VI to the training of generative adversarial nets.
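This page does not reproduce the fOGDA-VI update rule itself. As a rough illustration of the cost model the abstract describes (one operator evaluation and one projection onto the constraint set per iteration), the following sketch implements a classical projected optimistic gradient step; the previous operator value is reused, so each iteration costs one evaluation of `F` and one projection. The operator, the box constraint, and the step size below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def optimistic_projected(F, proj, z0, step=0.1, iters=2000):
    """Projected optimistic gradient iteration (illustrative sketch only,
    NOT the paper's fOGDA-VI). Per iteration: one evaluation of F and one
    projection; F(z_{k-1}) is reused from the previous step."""
    z = np.asarray(z0, dtype=float)
    Fz_prev = F(z)                 # one-off evaluation before the loop
    for _ in range(iters):
        Fz = F(z)                  # single operator evaluation
        z = proj(z - step * (2.0 * Fz - Fz_prev))  # single projection
        Fz_prev = Fz
    return z

# Toy monotone VI: F(x, y) = (y, -x), the saddle operator of f(x, y) = x*y,
# over the box [-1, 1]^2. The unique solution is (0, 0).
F = lambda z: np.array([z[1], -z[0]])
proj = lambda z: np.clip(z, -1.0, 1.0)
z_star = optimistic_projected(F, proj, z0=[0.9, -0.7])
```

On this toy bilinear problem the iterates spiral toward the solution $(0, 0)$; the paper's contribution is a method with last-iterate $o(\frac{1}{k})$ guarantees for general monotone operators, which this simple template does not provide.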

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-sedlmayer23a,
  title     = {A Fast Optimistic Method for Monotone Variational Inequalities},
  author    = {Sedlmayer, Michael and Nguyen, Dang-Khoa and Bot, Radu Ioan},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {30406--30438},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/sedlmayer23a/sedlmayer23a.pdf},
  url       = {https://proceedings.mlr.press/v202/sedlmayer23a.html}
}
Endnote
%0 Conference Paper
%T A Fast Optimistic Method for Monotone Variational Inequalities
%A Michael Sedlmayer
%A Dang-Khoa Nguyen
%A Radu Ioan Bot
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-sedlmayer23a
%I PMLR
%P 30406--30438
%U https://proceedings.mlr.press/v202/sedlmayer23a.html
%V 202
APA
Sedlmayer, M., Nguyen, D.-K., & Bot, R. I. (2023). A Fast Optimistic Method for Monotone Variational Inequalities. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:30406-30438. Available from https://proceedings.mlr.press/v202/sedlmayer23a.html.