From Nesterov’s Estimate Sequence to Riemannian Acceleration

Kwangjun Ahn, Suvrit Sra
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:84-118, 2020.

Abstract

We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov’s estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into “metric distortion.” We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
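The paper builds on Nesterov's accelerated gradient method in the Euclidean setting before generalizing it to manifolds. As background, here is a minimal Euclidean sketch of Nesterov-style acceleration for a strongly convex quadratic; this is illustrative only and is not the paper's Riemannian algorithm, which replaces straight-line updates with exponential maps and must additionally control metric distortion.

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, iters=100):
    """Constant-momentum Nesterov acceleration for a mu-strongly convex,
    L-smooth objective (Euclidean sketch, not the paper's Riemannian method)."""
    kappa = L / mu
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum weight
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)   # extrapolation (momentum) step
        x_prev = x
        x = y - grad(y) / L           # gradient step from the look-ahead point
    return x

# Example: quadratic f(x) = 0.5 * x^T A x with eigenvalues in [mu, L].
A = np.diag([1.0, 10.0])              # mu = 1, L = 10
grad = lambda x: A @ x
x_star = nesterov_agd(grad, np.array([5.0, 5.0]), L=10.0, mu=1.0)
```

The iterate converges to the minimizer at the accelerated rate depending on the square root of the condition number, rather than the condition number itself as in plain gradient descent.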

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-ahn20a,
  title     = {From Nesterov’s Estimate Sequence to Riemannian Acceleration},
  author    = {Ahn, Kwangjun and Sra, Suvrit},
  booktitle = {Proceedings of Thirty Third Conference on Learning Theory},
  pages     = {84--118},
  year      = {2020},
  editor    = {Abernethy, Jacob and Agarwal, Shivani},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/ahn20a/ahn20a.pdf},
  url       = {https://proceedings.mlr.press/v125/ahn20a.html},
  abstract  = {We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov’s estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into “metric distortion.” We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.}
}
Endnote
%0 Conference Paper
%T From Nesterov’s Estimate Sequence to Riemannian Acceleration
%A Kwangjun Ahn
%A Suvrit Sra
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-ahn20a
%I PMLR
%P 84--118
%U https://proceedings.mlr.press/v125/ahn20a.html
%V 125
%X We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov’s estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into “metric distortion.” We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
APA
Ahn, K. & Sra, S. (2020). From Nesterov’s Estimate Sequence to Riemannian Acceleration. Proceedings of Thirty Third Conference on Learning Theory, in Proceedings of Machine Learning Research 125:84-118. Available from https://proceedings.mlr.press/v125/ahn20a.html.