Complexity Guarantees for Polyak Steps with Momentum

Mathieu Barré, Adrien Taylor, Alexandre d’Aspremont
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:452-478, 2020.

Abstract

In smooth strongly convex optimization, knowledge of the strong convexity parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, where this knowledge is substituted by that of the optimal value, $f_*$. We first show slightly improved convergence bounds, compared to those previously known, for the classical case of simple gradient descent with Polyak steps; we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.
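
As a concrete illustration (ours, not the paper's accelerated method), the classical Polyak step size for gradient descent on a function $f$ with known optimal value $f_*$ is $\gamma_k = (f(x_k) - f_*)/\|\nabla f(x_k)\|^2$. Below is a minimal Python sketch under these assumptions; the names (gradient_descent_polyak, f, grad_f) and the toy quadratic are hypothetical:

import numpy as np

def gradient_descent_polyak(f, grad_f, f_star, x0, n_iters=100):
    # Gradient descent with the classical Polyak step size
    #   gamma_k = (f(x_k) - f_star) / ||grad f(x_k)||^2,
    # which uses knowledge of f_star instead of the strong convexity parameter.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad_f(x)
        gap = f(x) - f_star
        if gap <= 0 or not np.any(g):
            break  # at (numerical) optimality the Polyak step is zero/undefined
        x = x - (gap / np.dot(g, g)) * g  # Polyak step
    return x

# Toy usage: smooth strongly convex quadratic f(x) = 0.5 x^T A x, whose
# optimal value f_* = 0 is known exactly.
A = np.diag([1.0, 10.0])
x_hat = gradient_descent_polyak(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                                f_star=0.0, x0=[1.0, 1.0])

The momentum variant studied in the paper augments this step with an extrapolation term; see the paper for the exact scheme and its guarantees.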

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-barre20a,
  title     = {Complexity Guarantees for Polyak Steps with Momentum},
  author    = {Barr\'e, Mathieu and Taylor, Adrien and d'Aspremont, Alexandre},
  booktitle = {Proceedings of Thirty Third Conference on Learning Theory},
  pages     = {452--478},
  year      = {2020},
  editor    = {Abernethy, Jacob and Agarwal, Shivani},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/barre20a/barre20a.pdf},
  url       = {https://proceedings.mlr.press/v125/barre20a.html},
  abstract  = {In smooth strongly convex optimization, knowledge of the strong convexity parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, where this knowledge is substituted by that of the optimal value, $f_*$. We first show slightly improved convergence bounds, compared to those previously known, for the classical case of simple gradient descent with Polyak steps; we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.}
}
Endnote
%0 Conference Paper
%T Complexity Guarantees for Polyak Steps with Momentum
%A Mathieu Barré
%A Adrien Taylor
%A Alexandre d’Aspremont
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-barre20a
%I PMLR
%P 452--478
%U https://proceedings.mlr.press/v125/barre20a.html
%V 125
%X In smooth strongly convex optimization, knowledge of the strong convexity parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, where this knowledge is substituted by that of the optimal value, $f_*$. We first show slightly improved convergence bounds, compared to those previously known, for the classical case of simple gradient descent with Polyak steps; we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.
APA
Barré, M., Taylor, A. & d’Aspremont, A. (2020). Complexity Guarantees for Polyak Steps with Momentum. Proceedings of Thirty Third Conference on Learning Theory, in Proceedings of Machine Learning Research 125:452-478. Available from https://proceedings.mlr.press/v125/barre20a.html.
