# Near-optimal method for highly smooth convex optimization

*Proceedings of the Thirty-Second Conference on Learning Theory*, PMLR 99:492-507, 2019.

#### Abstract

We propose a near-optimal method for highly smooth convex optimization. More precisely, in the oracle model where one obtains the $p^{\text{th}}$ order Taylor expansion of a function at the query point, we propose a method with rate of convergence $\tilde{O}(1/k^{\frac{3p+1}{2}})$ after $k$ queries to the oracle for any convex function whose $p^{\text{th}}$ order derivative is Lipschitz.
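To make the oracle model concrete, the following is a sketch (under standard conventions; the symbols $T_p$ and $L_p$ are illustrative names, not necessarily the paper's notation) of what a query at a point $x$ returns and of the smoothness assumption:

```latex
% The p-th order Taylor oracle: querying at x returns the coefficients of
%   T_p(y; x) = \sum_{i=0}^{p} \frac{1}{i!} \nabla^i f(x)[y - x]^i,
% i.e., the value, gradient, and higher derivative tensors of f up to order p at x.
%
% The assumption that the p-th order derivative is Lipschitz (with constant L_p) reads
%   \| \nabla^p f(x) - \nabla^p f(y) \| \le L_p \, \| x - y \| \quad \forall x, y,
% which in particular bounds the Taylor remainder:
%   | f(y) - T_p(y; x) | \le \frac{L_p}{(p+1)!} \| y - x \|^{p+1}.
```

For $p = 1$ this recovers the usual gradient oracle with Lipschitz gradient, where the stated rate specializes to the familiar accelerated rate $\tilde{O}(1/k^2)$.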