Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives

Alexander Gasnikov, Pavel Dvurechensky, Eduard Gorbunov, Evgeniya Vorontsova, Daniil Selikhanovych, César A. Uribe, Bo Jiang, Haoyue Wang, Shuzhong Zhang, Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:1392-1393, 2019.

Abstract

In this merged paper, we consider the problem of minimizing a convex function with Lipschitz-continuous $p$-th order derivatives. Given an oracle which, when queried at a point, returns the first $p$ derivatives of the function at that point, we provide methods that compute an $\varepsilon$-approximate minimizer in $O\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ iterations. These methods match known lower bounds up to polylogarithmic factors for constant $p$.
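As a rough illustration (not part of the paper), the snippet below evaluates the order of the iteration bound $\varepsilon^{-2/(3p+1)}$ for a few values of $p$, ignoring constants and polylogarithmic factors; it shows how quickly the required iteration count shrinks as higher-order derivative information becomes available.

```python
def iteration_bound(eps: float, p: int) -> float:
    """Order of the iteration count eps^(-2/(3p+1)) from the stated
    complexity, with constants and polylog factors ignored."""
    return eps ** (-2.0 / (3 * p + 1))

# The exponent 2/(3p+1) shrinks as p grows, so for a fixed target
# accuracy eps the bound drops sharply with the oracle order p.
for p in (1, 2, 3):
    print(p, iteration_bound(1e-6, p))
```

For $\varepsilon = 10^{-6}$ the order of the bound falls from about $10^3$ at $p = 1$ (the classical accelerated first-order rate $\varepsilon^{-1/2}$) to roughly $10^{1.2}$ at $p = 3$.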

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-gasnikov19b,
  title     = {Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives},
  author    = {Gasnikov, Alexander and Dvurechensky, Pavel and Gorbunov, Eduard and Vorontsova, Evgeniya and Selikhanovych, Daniil and Uribe, C\'esar A. and Jiang, Bo and Wang, Haoyue and Zhang, Shuzhong and Bubeck, S\'ebastien and Jiang, Qijia and Lee, Yin Tat and Li, Yuanzhi and Sidford, Aaron},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {1392--1393},
  year      = {2019},
  editor    = {Alina Beygelzimer and Daniel Hsu},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  address   = {Phoenix, USA},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/gasnikov19b/gasnikov19b.pdf},
  url       = {http://proceedings.mlr.press/v99/gasnikov19b.html},
  abstract  = {In this merged paper, we consider the problem of minimizing a convex function with Lipschitz-continuous $p$-th order derivatives. Given an oracle which, when queried at a point, returns the first $p$ derivatives of the function at that point, we provide methods that compute an $\varepsilon$-approximate minimizer in $O\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ iterations. These methods match known lower bounds up to polylogarithmic factors for constant $p$.}
}
Endnote
%0 Conference Paper %T Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives %A Alexander Gasnikov %A Pavel Dvurechensky %A Eduard Gorbunov %A Evgeniya Vorontsova %A Daniil Selikhanovych %A César A. Uribe %A Bo Jiang %A Haoyue Wang %A Shuzhong Zhang %A Sébastien Bubeck %A Qijia Jiang %A Yin Tat Lee %A Yuanzhi Li %A Aaron Sidford %B Proceedings of the Thirty-Second Conference on Learning Theory %C Proceedings of Machine Learning Research %D 2019 %E Alina Beygelzimer %E Daniel Hsu %F pmlr-v99-gasnikov19b %I PMLR %J Proceedings of Machine Learning Research %P 1392--1393 %U http://proceedings.mlr.press/v99/gasnikov19b.html %V 99 %W PMLR %X In this merged paper, we consider the problem of minimizing a convex function with Lipschitz-continuous $p$-th order derivatives. Given an oracle which, when queried at a point, returns the first $p$ derivatives of the function at that point, we provide methods that compute an $\varepsilon$-approximate minimizer in $O\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ iterations. These methods match known lower bounds up to polylogarithmic factors for constant $p$.
APA
Gasnikov, A., Dvurechensky, P., Gorbunov, E., Vorontsova, E., Selikhanovych, D., Uribe, C.A., Jiang, B., Wang, H., Zhang, S., Bubeck, S., Jiang, Q., Lee, Y.T., Li, Y. & Sidford, A. (2019). Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives. Proceedings of the Thirty-Second Conference on Learning Theory, in PMLR 99:1392-1393.