Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization

Alexander Gasnikov, Pavel Dvurechensky, Eduard Gorbunov, Evgeniya Vorontsova, Daniil Selikhanovych, César A. Uribe
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:1374-1391, 2019.

Abstract

We consider convex optimization problems with the objective function having Lipschitz-continuous $p$-th order derivative, where $p\geq 1$. We propose a new tensor method, which closes the gap between the lower $\Omega\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ and upper $O\left(\varepsilon^{-\frac{1}{p+1}}\right)$ iteration complexity bounds for this class of optimization problems. We also consider uniformly convex functions, and show how the proposed method can be accelerated under this additional assumption. Moreover, we introduce a $p$-th order condition number which naturally arises in the complexity analysis of tensor methods under this assumption. Finally, we present a numerical study of the proposed optimal method and show that in practice it is faster than the best known accelerated tensor method. We also compare the performance of tensor methods for $p=2$ and $p=3$ and show that the 3rd-order method is superior to the 2nd-order method in practice.
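As a rough numerical illustration (not from the paper itself), the sketch below compares the two iteration-complexity exponents from the abstract: the lower bound $\Omega(\varepsilon^{-2/(3p+1)})$, which the proposed method matches, versus the previously best known upper bound $O(\varepsilon^{-1/(p+1)})$. The function names and the choice of $\varepsilon$ are purely illustrative; constants hidden in the $\Omega(\cdot)$ and $O(\cdot)$ notation are ignored.

```python
# Hypothetical comparison of iteration-complexity exponents (constants ignored).
def optimal_rate(eps, p):
    # eps^{-2/(3p+1)}: the lower bound, attained by the proposed optimal tensor method
    return eps ** (-2.0 / (3 * p + 1))

def previous_rate(eps, p):
    # eps^{-1/(p+1)}: rate of previously known accelerated tensor methods
    return eps ** (-1.0 / (p + 1))

eps = 1e-6  # illustrative target accuracy
for p in (1, 2, 3):
    print(f"p={p}: optimal eps^(-2/(3p+1)) = {optimal_rate(eps, p):9.1f}   "
          f"previous eps^(-1/(p+1)) = {previous_rate(eps, p):9.1f}")
```

Note that for $p=1$ the two exponents coincide ($2/(3p+1) = 1/(p+1) = 1/2$), recovering the classical accelerated first-order rate; the gap opens up only for $p \geq 2$, where higher-order smoothness yields strictly fewer iterations.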

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-gasnikov19a,
  title = {Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization},
  author = {Gasnikov, Alexander and Dvurechensky, Pavel and Gorbunov, Eduard and Vorontsova, Evgeniya and Selikhanovych, Daniil and Uribe, C\'esar A.},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages = {1374--1391},
  year = {2019},
  editor = {Beygelzimer, Alina and Hsu, Daniel},
  volume = {99},
  series = {Proceedings of Machine Learning Research},
  month = {25--28 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v99/gasnikov19a/gasnikov19a.pdf},
  url = {https://proceedings.mlr.press/v99/gasnikov19a.html},
  abstract = {We consider convex optimization problems with the objective function having Lipschitz-continuous $p$-th order derivative, where $p\geq 1$. We propose a new tensor method, which closes the gap between the lower $\Omega\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ and upper $O\left(\varepsilon^{-\frac{1}{p+1}}\right)$ iteration complexity bounds for this class of optimization problems. We also consider uniformly convex functions, and show how the proposed method can be accelerated under this additional assumption. Moreover, we introduce a $p$-th order condition number which naturally arises in the complexity analysis of tensor methods under this assumption. Finally, we present a numerical study of the proposed optimal method and show that in practice it is faster than the best known accelerated tensor method. We also compare the performance of tensor methods for $p=2$ and $p=3$ and show that the 3rd-order method is superior to the 2nd-order method in practice.}
}
Endnote
%0 Conference Paper
%T Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization
%A Alexander Gasnikov
%A Pavel Dvurechensky
%A Eduard Gorbunov
%A Evgeniya Vorontsova
%A Daniil Selikhanovych
%A César A. Uribe
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-gasnikov19a
%I PMLR
%P 1374--1391
%U https://proceedings.mlr.press/v99/gasnikov19a.html
%V 99
%X We consider convex optimization problems with the objective function having Lipschitz-continuous $p$-th order derivative, where $p\geq 1$. We propose a new tensor method, which closes the gap between the lower $\Omega\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ and upper $O\left(\varepsilon^{-\frac{1}{p+1}}\right)$ iteration complexity bounds for this class of optimization problems. We also consider uniformly convex functions, and show how the proposed method can be accelerated under this additional assumption. Moreover, we introduce a $p$-th order condition number which naturally arises in the complexity analysis of tensor methods under this assumption. Finally, we present a numerical study of the proposed optimal method and show that in practice it is faster than the best known accelerated tensor method. We also compare the performance of tensor methods for $p=2$ and $p=3$ and show that the 3rd-order method is superior to the 2nd-order method in practice.
APA
Gasnikov, A., Dvurechensky, P., Gorbunov, E., Vorontsova, E., Selikhanovych, D., & Uribe, C. A. (2019). Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:1374-1391. Available from https://proceedings.mlr.press/v99/gasnikov19a.html.

Related Material