Universal Average-Case Optimality of Polyak Momentum

Damien Scieur, Fabian Pedregosa
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:8565-8572, 2020.

Abstract

Polyak momentum (PM), also known as the heavy-ball method, is a widely used optimization method that enjoys an asymptotically optimal worst-case complexity on quadratic objectives. However, this optimality does not fully explain its remarkable empirical success, since the worst-case analysis, unlike the average-case analysis, is not representative of the expected complexity of an algorithm. In this work we establish a novel link between PM and average-case analysis. Our main contribution is to prove that, under mild assumptions, any average-case optimal method converges to PM as the number of iterations grows. This brings a new perspective on this classical method, showing that PM is asymptotically both worst-case and average-case optimal.
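
For reference (this sketch is not part of the paper's own material): the Polyak momentum, or heavy-ball, iteration discussed above is standardly written, for a differentiable objective f, as

    x_{k+1} = x_k - \alpha \nabla f(x_k) + \beta (x_k - x_{k-1}),

where \alpha > 0 is the step size and \beta \in [0, 1) the momentum parameter. On a quadratic whose Hessian eigenvalues lie in [\mu, L], the classical worst-case-optimal tuning is \alpha = 4/(\sqrt{L} + \sqrt{\mu})^2 and \beta = ((\sqrt{L} - \sqrt{\mu})/(\sqrt{L} + \sqrt{\mu}))^2.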

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-scieur20a,
  title     = {Universal Average-Case Optimality of Polyak Momentum},
  author    = {Scieur, Damien and Pedregosa, Fabian},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {8565--8572},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/scieur20a/scieur20a.pdf},
  url       = {https://proceedings.mlr.press/v119/scieur20a.html},
  abstract  = {Polyak momentum (PM), also known as the heavy-ball method, is a widely used optimization method that enjoys an asymptotically optimal worst-case complexity on quadratic objectives. However, this optimality does not fully explain its remarkable empirical success, since the worst-case analysis, unlike the average-case analysis, is not representative of the expected complexity of an algorithm. In this work we establish a novel link between PM and average-case analysis. Our main contribution is to prove that, under mild assumptions, any average-case optimal method converges to PM as the number of iterations grows. This brings a new perspective on this classical method, showing that PM is asymptotically both worst-case and average-case optimal.}
}
Endnote
%0 Conference Paper
%T Universal Average-Case Optimality of Polyak Momentum
%A Damien Scieur
%A Fabian Pedregosa
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-scieur20a
%I PMLR
%P 8565--8572
%U https://proceedings.mlr.press/v119/scieur20a.html
%V 119
%X Polyak momentum (PM), also known as the heavy-ball method, is a widely used optimization method that enjoys an asymptotically optimal worst-case complexity on quadratic objectives. However, this optimality does not fully explain its remarkable empirical success, since the worst-case analysis, unlike the average-case analysis, is not representative of the expected complexity of an algorithm. In this work we establish a novel link between PM and average-case analysis. Our main contribution is to prove that, under mild assumptions, any average-case optimal method converges to PM as the number of iterations grows. This brings a new perspective on this classical method, showing that PM is asymptotically both worst-case and average-case optimal.
APA
Scieur, D. & Pedregosa, F. (2020). Universal Average-Case Optimality of Polyak Momentum. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:8565-8572. Available from https://proceedings.mlr.press/v119/scieur20a.html.