A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise

Francis Bach, Kfir Y Levy
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:164-194, 2019.

Abstract

We consider variational inequalities coming from monotone operators, a setting that includes convex minimization and convex-concave saddle-point problems. We assume access to potentially noisy unbiased values of the monotone operators and assess convergence through a compatible gap function which corresponds to the standard optimality criteria in the aforementioned subcases. We present a universal algorithm for these inequalities based on the Mirror-Prox algorithm. Concretely, our algorithm simultaneously achieves the optimal rates for the smooth/non-smooth and noisy/noiseless settings. This is done without any prior knowledge of these properties, and in the general set-up of arbitrary norms and compatible Bregman divergences. For convex minimization and convex-concave saddle-point problems, this leads to new adaptive algorithms. Our method relies on a novel yet simple adaptive choice of the step-size, which can be seen as the appropriate extension of AdaGrad to handle constrained problems.
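To make the step-size idea concrete, below is a minimal Euclidean sketch of a Mirror-Prox (extra-gradient) iteration with an AdaGrad-style adaptive step size. It is an illustration under simplifying assumptions, not the paper's exact scheme: the Euclidean projection stands in for the general Bregman mirror step, and the function names, the constants D and G0, and the particular accumulation rule for the step size are all hypothetical choices made for readability.

```python
import numpy as np

def universal_mirror_prox(F, project, z0, T, D=1.0, G0=1.0):
    """Sketch of Mirror-Prox (extra-gradient) with an AdaGrad-style step size.

    F       : monotone operator; F(z) may return a noisy unbiased estimate
    project : Euclidean projection onto the constraint set (stands in for
              the general Bregman mirror step of the paper)
    z0      : initial point inside the set
    D, G0   : illustrative diameter and initial-magnitude constants
    """
    z = np.asarray(z0, dtype=float).copy()
    acc = G0 ** 2                      # running accumulator for the step size
    avg = np.zeros_like(z)
    weight = 0.0
    for _ in range(T):
        eta = D / np.sqrt(acc)         # adaptive step size, AdaGrad flavour
        g_z = F(z)
        w = project(z - eta * g_z)     # extrapolation (look-ahead) step
        g_w = F(w)
        z = project(z - eta * g_w)     # update step, using the look-ahead value
        acc += np.linalg.norm(g_w - g_z) ** 2  # illustrative accumulation rule
        avg += eta * w                 # step-size-weighted iterate averaging
        weight += eta
    return avg / weight
```

As a usage example, for the bilinear saddle point min_x max_y x^T A y over a product of balls, one can take F(z) to stack (A y, -A^T x) and let project rescale each block onto its ball; the averaged iterate is then evaluated through the gap function.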

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-bach19a,
  title     = {A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise},
  author    = {Bach, Francis and Levy, Kfir Y},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {164--194},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/bach19a/bach19a.pdf},
  url       = {https://proceedings.mlr.press/v99/bach19a.html},
  abstract  = {We consider variational inequalities coming from monotone operators, a setting that includes convex minimization and convex-concave saddle-point problems. We assume access to potentially noisy unbiased values of the monotone operators and assess convergence through a compatible gap function which corresponds to the standard optimality criteria in the aforementioned subcases. We present a universal algorithm for these inequalities based on the Mirror-Prox algorithm. Concretely, our algorithm \emph{simultaneously} achieves the optimal rates for the smooth/non-smooth and noisy/noiseless settings. This is done without any prior knowledge of these properties, and in the general set-up of arbitrary norms and compatible Bregman divergences. For convex minimization and convex-concave saddle-point problems, this leads to new adaptive algorithms. Our method relies on a novel yet simple adaptive choice of the step-size, which can be seen as the appropriate extension of AdaGrad to handle constrained problems.}
}
Endnote
%0 Conference Paper
%T A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise
%A Francis Bach
%A Kfir Y Levy
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-bach19a
%I PMLR
%P 164--194
%U https://proceedings.mlr.press/v99/bach19a.html
%V 99
%X We consider variational inequalities coming from monotone operators, a setting that includes convex minimization and convex-concave saddle-point problems. We assume access to potentially noisy unbiased values of the monotone operators and assess convergence through a compatible gap function which corresponds to the standard optimality criteria in the aforementioned subcases. We present a universal algorithm for these inequalities based on the Mirror-Prox algorithm. Concretely, our algorithm simultaneously achieves the optimal rates for the smooth/non-smooth and noisy/noiseless settings. This is done without any prior knowledge of these properties, and in the general set-up of arbitrary norms and compatible Bregman divergences. For convex minimization and convex-concave saddle-point problems, this leads to new adaptive algorithms. Our method relies on a novel yet simple adaptive choice of the step-size, which can be seen as the appropriate extension of AdaGrad to handle constrained problems.
APA
Bach, F. & Levy, K. Y. (2019). A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:164-194. Available from https://proceedings.mlr.press/v99/bach19a.html.
