Complexity Guarantees for Polyak Steps with Momentum

Mathieu Barré, Adrien Taylor, Alexandre d’Aspremont
Proceedings of the Thirty Third Conference on Learning Theory, PMLR 125:452-478, 2020.

Abstract

In smooth strongly convex optimization, knowledge of the strong convexity parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, where this knowledge is substituted by that of the optimal value, $f_*$. We first show slightly improved convergence bounds over those previously known for the classical case of simple gradient descent with Polyak steps; we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.
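The classical Polyak step studied here sets the step size to $(f(x_k) - f_*)/\|\nabla f(x_k)\|^2$, so the method needs only the optimal value $f_*$ rather than the strong convexity parameter. A minimal sketch of this base method (not the paper's accelerated momentum variant; the function names and the quadratic test objective are illustrative assumptions):

```python
import numpy as np

def polyak_gradient_descent(f, grad, x0, f_star, n_iters=200):
    """Gradient descent with the classical Polyak step size.

    Each step uses gamma_k = (f(x_k) - f_*) / ||grad f(x_k)||^2,
    so only the optimal value f_* is required, not the strong
    convexity parameter.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x)
        gap = f(x) - f_star
        norm_sq = g @ g
        if norm_sq == 0.0 or gap <= 0.0:
            break  # at (or numerically at) a minimizer
        x = x - (gap / norm_sq) * g
    return x

# Illustrative smooth strongly convex objective: f(x) = 0.5 x^T A x,
# minimized at x = 0 with known optimal value f_* = 0.
A = np.diag([1.0, 10.0])          # condition number L/mu = 10
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_hat = polyak_gradient_descent(f, grad, x0=[1.0, 1.0], f_star=0.0)
```

On smooth strongly convex problems this step size yields linear convergence, which is the baseline the paper's accelerated variant improves upon.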

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-barre20a,
  title = {Complexity Guarantees for Polyak Steps with Momentum},
  author = {Barr\'e, Mathieu and Taylor, Adrien and d'Aspremont, Alexandre},
  pages = {452--478},
  year = {2020},
  editor = {Jacob Abernethy and Shivani Agarwal},
  volume = {125},
  series = {Proceedings of Machine Learning Research},
  month = {09--12 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v125/barre20a/barre20a.pdf},
  url = {http://proceedings.mlr.press/v125/barre20a.html},
  abstract = {In smooth strongly convex optimization, knowledge of the strong convexity parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, where this knowledge is substituted by that of the optimal value, $f_*$. We first show slightly improved convergence bounds than previously known for the classical case of simple gradient descent with Polyak steps, we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.}
}
Endnote
%0 Conference Paper
%T Complexity Guarantees for Polyak Steps with Momentum
%A Mathieu Barré
%A Adrien Taylor
%A Alexandre d’Aspremont
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-barre20a
%I PMLR
%J Proceedings of Machine Learning Research
%P 452--478
%U http://proceedings.mlr.press
%V 125
%W PMLR
%X In smooth strongly convex optimization, knowledge of the strong convexity parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, where this knowledge is substituted by that of the optimal value, $f_*$. We first show slightly improved convergence bounds than previously known for the classical case of simple gradient descent with Polyak steps, we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.
APA
Barré, M., Taylor, A., & d’Aspremont, A. (2020). Complexity Guarantees for Polyak Steps with Momentum. Proceedings of the Thirty Third Conference on Learning Theory, in PMLR 125:452-478.

Related Material