Accelerated Gradient Methods for Geodesically Convex Optimization: Tractable Algorithms and Convergence Analysis

Jungbin Kim, Insoon Yang
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:11255-11282, 2022.

Abstract

We propose computationally tractable accelerated first-order methods for Riemannian optimization, extending the Nesterov accelerated gradient (NAG) method. For both geodesically convex and geodesically strongly convex objective functions, our algorithms are shown to have the same iteration complexities as those for the NAG method on Euclidean spaces, under only standard assumptions. To the best of our knowledge, the proposed scheme is the first fully accelerated method for geodesically convex optimization problems. Our convergence analysis makes use of novel metric distortion lemmas as well as carefully designed potential functions. A connection with the continuous-time dynamics for modeling Riemannian acceleration in (Alimisis et al., 2020) is also identified by letting the stepsize tend to zero. We validate our theoretical results through numerical experiments.
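To make the flavor of such a method concrete, the sketch below transplants a generic Nesterov-style two-sequence update onto the unit sphere using the sphere's exponential and logarithm maps. It is only an illustration of the kind of iteration the paper studies, not the authors' exact algorithm: the test objective f(x) = (1/2) d(x, p)^2 (squared geodesic distance to a fixed point p), the step size, and the momentum coefficient are assumptions made purely for this example.

import numpy as np

def exp_map(x, v):
    # Sphere exponential map: follow the geodesic from x in tangent direction v.
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * v / nv

def log_map(x, y):
    # Sphere logarithm map: tangent vector at x pointing toward y, with norm d(x, y).
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)
    u = y - c * x
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros_like(x)
    return theta * u / nu

def riemannian_nag(p, x0, steps=100, lr=0.5, beta=0.6):
    # Generic geodesic analogue of Nesterov's two-sequence update (illustrative only).
    x_prev, y = x0, x0
    for _ in range(steps):
        grad = -log_map(y, p)                        # Riemannian gradient of 0.5 * d(y, p)^2
        x = exp_map(y, -lr * grad)                   # gradient step from the extrapolated point y
        y = exp_map(x, -beta * log_map(x, x_prev))   # momentum: move away from the previous iterate
        x_prev = x
    return x_prev

rng = np.random.default_rng(0)
p = rng.normal(size=3); p /= np.linalg.norm(p)       # target point (the minimizer)
x0 = rng.normal(size=3); x0 /= np.linalg.norm(x0)    # initial iterate on the sphere
x = riemannian_nag(p, x0)
print("geodesic distance to minimizer:", np.arccos(np.clip(np.dot(x, p), -1.0, 1.0)))

On a Euclidean space, where exp and log reduce to vector addition and subtraction, this iteration is the standard NAG update, whose iteration complexities are O(1/k^2) for convex and O(sqrt(kappa) log(1/epsilon)) for strongly convex objectives; the paper's contribution is a tractable scheme that provably matches these rates in the geodesically (strongly) convex setting.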

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-kim22k,
  title     = {Accelerated Gradient Methods for Geodesically Convex Optimization: Tractable Algorithms and Convergence Analysis},
  author    = {Kim, Jungbin and Yang, Insoon},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {11255--11282},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/kim22k/kim22k.pdf},
  url       = {https://proceedings.mlr.press/v162/kim22k.html},
  abstract  = {We propose computationally tractable accelerated first-order methods for Riemannian optimization, extending the Nesterov accelerated gradient (NAG) method. For both geodesically convex and geodesically strongly convex objective functions, our algorithms are shown to have the same iteration complexities as those for the NAG method on Euclidean spaces, under only standard assumptions. To the best of our knowledge, the proposed scheme is the first fully accelerated method for geodesically convex optimization problems. Our convergence analysis makes use of novel metric distortion lemmas as well as carefully designed potential functions. A connection with the continuous-time dynamics for modeling Riemannian acceleration in (Alimisis et al., 2020) is also identified by letting the stepsize tend to zero. We validate our theoretical results through numerical experiments.}
}
Endnote
%0 Conference Paper
%T Accelerated Gradient Methods for Geodesically Convex Optimization: Tractable Algorithms and Convergence Analysis
%A Jungbin Kim
%A Insoon Yang
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-kim22k
%I PMLR
%P 11255--11282
%U https://proceedings.mlr.press/v162/kim22k.html
%V 162
%X We propose computationally tractable accelerated first-order methods for Riemannian optimization, extending the Nesterov accelerated gradient (NAG) method. For both geodesically convex and geodesically strongly convex objective functions, our algorithms are shown to have the same iteration complexities as those for the NAG method on Euclidean spaces, under only standard assumptions. To the best of our knowledge, the proposed scheme is the first fully accelerated method for geodesically convex optimization problems. Our convergence analysis makes use of novel metric distortion lemmas as well as carefully designed potential functions. A connection with the continuous-time dynamics for modeling Riemannian acceleration in (Alimisis et al., 2020) is also identified by letting the stepsize tend to zero. We validate our theoretical results through numerical experiments.
APA
Kim, J. & Yang, I. (2022). Accelerated Gradient Methods for Geodesically Convex Optimization: Tractable Algorithms and Convergence Analysis. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:11255-11282. Available from https://proceedings.mlr.press/v162/kim22k.html.