Momentum Improves Optimization on Riemannian Manifolds

Foivos Alimisis, Antonio Orvieto, Gary Becigneul, Aurelien Lucchi
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1351-1359, 2021.

Abstract

We develop a new Riemannian descent algorithm that relies on momentum to improve over existing first-order methods for geodesically convex optimization. In contrast, the accelerated convergence rates proved in prior work hold only for geodesically strongly-convex objective functions. We further extend our algorithm to geodesically weakly-quasi-convex objectives. Our proofs of convergence rely on a novel estimate sequence that makes explicit how the convergence rate depends on the curvature of the manifold. We validate our theoretical results empirically on several optimization problems defined on the sphere and on the manifold of positive definite matrices.
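For intuition, the following is a minimal sketch of gradient descent with heavy-ball momentum on the unit sphere, written with NumPy. It illustrates the general idea rather than the paper's exact algorithm: the momentum recursion and step sizes analyzed in the paper differ, the sketch moves the momentum to the new tangent space by projection rather than by parallel transport (a common practical shortcut), and the objective (leading eigenvector of a symmetric matrix) and all parameter values are illustrative assumptions.

import numpy as np

def sphere_exp(x, v):
    # Exponential map on the unit sphere: follow the geodesic from x
    # in the tangent direction v for a distance ||v||.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def tangent_project(x, g):
    # Project a Euclidean vector onto the tangent space of the sphere at x.
    return g - np.dot(x, g) * x

def riemannian_momentum_gd(euclidean_grad, x0, lr=0.02, beta=0.9, iters=500):
    # Riemannian gradient descent with heavy-ball momentum on the unit sphere.
    # lr, beta, and iters are illustrative choices, not values from the paper.
    x = x0 / np.linalg.norm(x0)
    m = np.zeros_like(x)  # momentum, kept as a tangent vector at the current iterate
    for _ in range(iters):
        g = tangent_project(x, euclidean_grad(x))  # Riemannian gradient
        m = beta * m + g                           # heavy-ball momentum update
        x = sphere_exp(x, -lr * m)                 # geodesic step
        m = tangent_project(x, m)                  # projection in lieu of parallel transport
    return x

# Usage: compute the leading eigenvector of a symmetric matrix by
# minimizing f(x) = -x^T A x over the unit sphere.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
A = (A + A.T) / 2
x_star = riemannian_momentum_gd(lambda x: -2 * A @ x, rng.standard_normal(10))
print(x_star @ A @ x_star)  # should approach the largest eigenvalue of A

The closed-form exponential map used above is specific to the sphere; on the manifold of positive definite matrices, the other setting in the paper's experiments, the exponential map and transport take different closed forms.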

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-alimisis21a,
  title     = {Momentum Improves Optimization on Riemannian Manifolds},
  author    = {Alimisis, Foivos and Orvieto, Antonio and Becigneul, Gary and Lucchi, Aurelien},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {1351--1359},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/alimisis21a/alimisis21a.pdf},
  url       = {https://proceedings.mlr.press/v130/alimisis21a.html},
  abstract  = {We develop a new Riemannian descent algorithm that relies on momentum to improve over existing first-order methods for geodesically convex optimization. In contrast, accelerated convergence rates proved in prior work have only been shown to hold for geodesically strongly-convex objective functions. We further extend our algorithm to geodesically weakly-quasi-convex objectives. Our proofs of convergence rely on a novel estimate sequence that illustrates the dependency of the convergence rate on the curvature of the manifold. We validate our theoretical results empirically on several optimization problems defined on the sphere and on the manifold of positive definite matrices.}
}
Endnote
%0 Conference Paper
%T Momentum Improves Optimization on Riemannian Manifolds
%A Foivos Alimisis
%A Antonio Orvieto
%A Gary Becigneul
%A Aurelien Lucchi
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-alimisis21a
%I PMLR
%P 1351--1359
%U https://proceedings.mlr.press/v130/alimisis21a.html
%V 130
%X We develop a new Riemannian descent algorithm that relies on momentum to improve over existing first-order methods for geodesically convex optimization. In contrast, accelerated convergence rates proved in prior work have only been shown to hold for geodesically strongly-convex objective functions. We further extend our algorithm to geodesically weakly-quasi-convex objectives. Our proofs of convergence rely on a novel estimate sequence that illustrates the dependency of the convergence rate on the curvature of the manifold. We validate our theoretical results empirically on several optimization problems defined on the sphere and on the manifold of positive definite matrices.
APA
Alimisis, F., Orvieto, A., Becigneul, G. & Lucchi, A. (2021). Momentum Improves Optimization on Riemannian Manifolds. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1351-1359. Available from https://proceedings.mlr.press/v130/alimisis21a.html.
