Near-optimal method for highly smooth convex optimization

Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:492-507, 2019.

Abstract

We propose a near-optimal method for highly smooth convex optimization. More precisely, in the oracle model where one obtains the $p^{th}$ order Taylor expansion of a function at the query point, we propose a method with rate of convergence $\tilde{O}(1/k^{\frac{3p+1}{2}})$ after $k$ queries to the oracle for any convex function whose $p^{th}$ order derivative is Lipschitz.
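
To unpack the rate: a $p^{th}$ order oracle queried at $x$ returns the Taylor coefficients $f(x), \nabla f(x), \ldots, \nabla^p f(x)$, and the smoothness assumption reads $\|\nabla^p f(x) - \nabla^p f(y)\| \le L_p \|x - y\|$ for all $x, y$. As a sanity check on the exponent $\frac{3p+1}{2}$ (standard context, not stated in the abstract itself): for $p = 1$ the rate is $\tilde{O}(1/k^2)$, matching Nesterov's accelerated gradient method up to logarithmic factors, and for $p = 2$ it is $\tilde{O}(1/k^{7/2})$; the method is "near-optimal" in the sense that this exponent matches the known oracle lower bound $\Omega(1/k^{\frac{3p+1}{2}})$ up to logarithmic factors.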

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-bubeck19a,
  title     = {Near-optimal method for highly smooth convex optimization},
  author    = {Bubeck, S{\'e}bastien and Jiang, Qijia and Lee, Yin Tat and Li, Yuanzhi and Sidford, Aaron},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {492--507},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/bubeck19a/bubeck19a.pdf},
  url       = {https://proceedings.mlr.press/v99/bubeck19a.html},
  abstract  = {We propose a near-optimal method for highly smooth convex optimization. More precisely, in the oracle model where one obtains the $p^{th}$ order Taylor expansion of a function at the query point, we propose a method with rate of convergence $\tilde{O}(1/k^{\frac{3p+1}{2}})$ after $k$ queries to the oracle for any convex function whose $p^{th}$ order derivative is Lipschitz.}
}
Endnote
%0 Conference Paper
%T Near-optimal method for highly smooth convex optimization
%A Sébastien Bubeck
%A Qijia Jiang
%A Yin Tat Lee
%A Yuanzhi Li
%A Aaron Sidford
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-bubeck19a
%I PMLR
%P 492--507
%U https://proceedings.mlr.press/v99/bubeck19a.html
%V 99
%X We propose a near-optimal method for highly smooth convex optimization. More precisely, in the oracle model where one obtains the $p^{th}$ order Taylor expansion of a function at the query point, we propose a method with rate of convergence $\tilde{O}(1/k^{\frac{3p+1}{2}})$ after $k$ queries to the oracle for any convex function whose $p^{th}$ order derivative is Lipschitz.
APA
Bubeck, S., Jiang, Q., Lee, Y.T., Li, Y. & Sidford, A. (2019). Near-optimal method for highly smooth convex optimization. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:492-507. Available from https://proceedings.mlr.press/v99/bubeck19a.html.