The power of first-order smooth optimization for black-box non-smooth problems

Alexander Gasnikov, Anton Novitskii, Vasilii Novitskii, Farshed Abdukhakimov, Dmitry Kamzolov, Aleksandr Beznosikov, Martin Takac, Pavel Dvurechensky, Bin Gu
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:7241-7265, 2022.

Abstract

Gradient-free/zeroth-order methods for black-box convex optimization have been extensively studied in the last decade, with the main focus on oracle call complexity. In this paper, besides oracle complexity, we also focus on iteration complexity and propose a generic approach that, based on optimal first-order methods, yields in a black-box fashion new zeroth-order algorithms for non-smooth convex optimization problems. Our approach not only leads to optimal oracle complexity, but also achieves iteration complexity similar to that of first-order methods, which in turn makes it possible to exploit parallel computations to accelerate the convergence of our algorithms. We also elaborate on extensions to stochastic optimization problems, saddle-point problems, and distributed optimization.
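
The approach described above builds zeroth-order methods on top of first-order ones by replacing exact gradients with estimates computed from function values only. Below is a minimal illustrative sketch, not the algorithm from the paper: it shows the standard two-point randomized-smoothing gradient estimator that such reductions typically rest on, plugged into plain gradient descent. All names (zo_gradient, tau, batch) and parameter values are assumptions chosen for the example.

import numpy as np

def zo_gradient(f, x, tau=1e-4, batch=16, rng=None):
    """Two-point randomized-smoothing gradient estimate of f at x.

    Averages d/(2*tau) * (f(x + tau*e) - f(x - tau*e)) * e over `batch`
    directions e drawn uniformly from the unit sphere. This estimates the
    gradient of the smoothed surrogate f_tau(x) = E_u[f(x + tau*u)],
    which is smooth even when f itself is not.
    """
    rng = rng or np.random.default_rng(0)
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(batch):
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)  # uniform direction on the unit sphere
        g += d * (f(x + tau * e) - f(x - tau * e)) / (2 * tau) * e
    return g / batch

def zo_gradient_descent(f, x0, lr=0.1, iters=200, **kw):
    """Plain gradient descent driven only by function evaluations."""
    x = x0.copy()
    for _ in range(iters):
        x -= lr * zo_gradient(f, x, **kw)
    return x

if __name__ == "__main__":
    # Non-smooth convex test problem: f(x) = ||x - 1||_1.
    f = lambda x: np.abs(x - 1.0).sum()
    x_final = zo_gradient_descent(f, np.zeros(10), lr=0.05, iters=500)
    print(f"f(x) after optimization: {f(x_final):.3f}")

The batch of function evaluations inside each iteration is embarrassingly parallel; this is the sense in which an iteration complexity comparable to first-order methods lets parallel computation accelerate convergence, as the abstract notes. The paper's actual constructions plug such estimators into optimal first-order methods with carefully chosen smoothing and batch parameters; the sketch only conveys the basic mechanism.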

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-gasnikov22a,
  title     = {The power of first-order smooth optimization for black-box non-smooth problems},
  author    = {Gasnikov, Alexander and Novitskii, Anton and Novitskii, Vasilii and Abdukhakimov, Farshed and Kamzolov, Dmitry and Beznosikov, Aleksandr and Takac, Martin and Dvurechensky, Pavel and Gu, Bin},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {7241--7265},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/gasnikov22a/gasnikov22a.pdf},
  url       = {https://proceedings.mlr.press/v162/gasnikov22a.html}
}
Endnote
%0 Conference Paper
%T The power of first-order smooth optimization for black-box non-smooth problems
%A Alexander Gasnikov
%A Anton Novitskii
%A Vasilii Novitskii
%A Farshed Abdukhakimov
%A Dmitry Kamzolov
%A Aleksandr Beznosikov
%A Martin Takac
%A Pavel Dvurechensky
%A Bin Gu
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-gasnikov22a
%I PMLR
%P 7241--7265
%U https://proceedings.mlr.press/v162/gasnikov22a.html
%V 162
APA
Gasnikov, A., Novitskii, A., Novitskii, V., Abdukhakimov, F., Kamzolov, D., Beznosikov, A., Takac, M., Dvurechensky, P., & Gu, B. (2022). The power of first-order smooth optimization for black-box non-smooth problems. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:7241-7265. Available from https://proceedings.mlr.press/v162/gasnikov22a.html.

Related Material

Download PDF: https://proceedings.mlr.press/v162/gasnikov22a/gasnikov22a.pdf