When Extragradient Meets PAGE: Bridging Two Giants to Boost Variational Inequalities

Gleb Molodtsov, Valery Parfenov, Egor Petrov, Evseev Grigoriy, Daniil Medyakov, Aleksandr Beznosikov
Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, PMLR 286:3093-3122, 2025.

Abstract

Variational inequalities (VIs) have emerged as a universal framework for solving a wide range of problems. Applications span optimization, equilibrium analysis, reinforcement learning, and the rapidly evolving field of generative adversarial networks (GANs). Stochastic methods have proven to be powerful tools for addressing such problems, but they often suffer from irreducible variance, necessitating the development of variance reduction techniques. Among these, SARAH-based algorithms have demonstrated remarkable practical effectiveness. In this work, we propose a new stochastic variance-reduced algorithm for solving stochastic variational inequalities. We push the boundaries of existing methodologies by leveraging the PAGE method to solve VIs. Unlike prior studies, which lacked theoretical guarantees under general assumptions, we establish rigorous convergence rates, thus closing a crucial gap in the literature. Our contributions advance both the theoretical understanding and the practical toolkit for solving variational inequalities. To substantiate our claims, we conduct extensive experiments across diverse benchmarks, including the widely studied denoising task. The results consistently showcase the superior efficiency of our approach, underscoring its potential for real-world applications.
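As background for the title: the standard (Stampacchia) form of a VI over a closed convex set Z with operator F, together with the classical extragradient update of Korpelevich that the method builds on, reads:

```latex
\text{Find } z^* \in \mathcal{Z} \ \text{such that} \ \langle F(z^*),\, z - z^* \rangle \ge 0 \quad \forall\, z \in \mathcal{Z},
\qquad
\begin{aligned}
z^{k+1/2} &= \operatorname{proj}_{\mathcal{Z}}\bigl(z^k - \gamma F(z^k)\bigr),\\
z^{k+1}   &= \operatorname{proj}_{\mathcal{Z}}\bigl(z^k - \gamma F(z^{k+1/2})\bigr).
\end{aligned}
```

To make the "Extragradient meets PAGE" combination concrete, below is a minimal sketch of an extragradient loop whose operator estimate is maintained PAGE-style: with probability p it recomputes the full operator, and otherwise it applies a cheap SARAH-type recursive correction on a minibatch. The function names (`page_extragradient`, `F_i`), the unconstrained setting, and the step-size and probability choices are our illustrative assumptions, not the paper's exact algorithm or tuning.

```python
import numpy as np

def page_extragradient(F_i, n, z0, gamma=0.1, p=0.2, batch=8,
                       steps=1000, seed=0):
    """Illustrative PAGE-style extragradient loop (unconstrained case).

    F_i(i, z) returns the i-th component operator; the full operator
    is F(z) = (1/n) * sum_i F_i(i, z).  Requires batch <= n.
    """
    rng = np.random.default_rng(seed)
    full = lambda w: np.mean([F_i(i, w) for i in range(n)], axis=0)

    z = np.asarray(z0, dtype=float).copy()
    g = full(z)          # full operator at the starting point
    prev = z.copy()      # point at which g was last evaluated
    for _ in range(steps):
        z_half = z - gamma * g              # extrapolation step
        if rng.random() < p:                # PAGE switch: rare full pass
            g = full(z_half)
        else:                               # SARAH-style recursive update
            idx = rng.choice(n, size=batch, replace=False)
            g = g + np.mean(
                [F_i(i, z_half) - F_i(i, prev) for i in idx], axis=0)
        prev = z_half
        z = z - gamma * g                   # update step
    return z
```

As a toy sanity check under the same assumptions, a strongly monotone linear operator F(z) = (1/n) * sum_i M_i z has the unique solution z* = 0, which the loop should approach:

```python
# Toy usage: average of near-identity matrices gives a monotone operator.
n, d = 50, 4
rng = np.random.default_rng(1)
Ms = [np.eye(d) + 0.1 * rng.standard_normal((d, d)) for _ in range(n)]
F_i = lambda i, z: Ms[i] @ z
z = page_extragradient(F_i, n, z0=np.ones(d), gamma=0.05, steps=2000)
print(np.linalg.norm(z))  # should be close to 0
```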

Cite this Paper


BibTeX
@InProceedings{pmlr-v286-molodtsov25a,
  title     = {When Extragradient Meets PAGE: Bridging Two Giants to Boost Variational Inequalities},
  author    = {Molodtsov, Gleb and Parfenov, Valery and Petrov, Egor and Grigoriy, Evseev and Medyakov, Daniil and Beznosikov, Aleksandr},
  booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
  pages     = {3093--3122},
  year      = {2025},
  editor    = {Chiappa, Silvia and Magliacane, Sara},
  volume    = {286},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--25 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v286/main/assets/molodtsov25a/molodtsov25a.pdf},
  url       = {https://proceedings.mlr.press/v286/molodtsov25a.html}
}
APA
Molodtsov, G., Parfenov, V., Petrov, E., Grigoriy, E., Medyakov, D. & Beznosikov, A. (2025). When Extragradient Meets PAGE: Bridging Two Giants to Boost Variational Inequalities. Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 286:3093-3122. Available from https://proceedings.mlr.press/v286/molodtsov25a.html.