Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:1374-1391, 2019.
Abstract
We consider convex optimization problems in which the objective function has a Lipschitz-continuous $p$-th order derivative, where $p\geq 1$. We propose a new tensor method which closes the gap between the lower $\Omega\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ and upper $O\left(\varepsilon^{-\frac{1}{p+1}}\right)$ iteration complexity bounds for this class of optimization problems. We also consider uniformly convex functions and show how the proposed method can be accelerated under this additional assumption. Moreover, we introduce a $p$-th order condition number, which arises naturally in the complexity analysis of tensor methods under this assumption. Finally, we present a numerical study of the proposed optimal method and show that in practice it is faster than the best known accelerated tensor method. We also compare the performance of tensor methods for $p=2$ and $p=3$ and show that the 3rd-order method is superior to the 2nd-order method in practice.
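For concreteness, a worked check of the two exponents at small $p$ (an illustration added here; the values of $p$ are chosen for comparison only) shows where the gap between the previous upper bound and the lower bound lies:
$$
\begin{aligned}
p=1:&\quad O\left(\varepsilon^{-\frac{1}{2}}\right) \;\text{vs.}\; \Omega\left(\varepsilon^{-\frac{2}{4}}\right)=\Omega\left(\varepsilon^{-\frac{1}{2}}\right) &&\text{(bounds coincide: the accelerated gradient rate)}\\
p=2:&\quad O\left(\varepsilon^{-\frac{1}{3}}\right) \;\text{vs.}\; \Omega\left(\varepsilon^{-\frac{2}{7}}\right) &&\text{(gap)}\\
p=3:&\quad O\left(\varepsilon^{-\frac{1}{4}}\right) \;\text{vs.}\; \Omega\left(\varepsilon^{-\frac{2}{10}}\right)=\Omega\left(\varepsilon^{-\frac{1}{5}}\right) &&\text{(gap)}
\end{aligned}
$$
Thus the gap closed by the proposed method is present for every $p\geq 2$, while for $p=1$ the two bounds already coincide at the classical optimal first-order rate.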