An Estimate Sequence for Geodesically Convex Optimization

Hongyi Zhang, Suvrit Sra
Proceedings of the 31st Conference On Learning Theory, PMLR 75:1703-1723, 2018.

Abstract

We propose a Riemannian version of Nesterov’s Accelerated Gradient algorithm (Ragd), and show that for geodesically smooth and strongly convex problems, within a neighborhood of the minimizer whose radius depends on the condition number as well as the sectional curvature of the manifold, Ragd converges to the minimizer with acceleration. Unlike the algorithm of Liu et al. (2017), which requires the exact solution of a nonlinear equation that may itself be intractable, our algorithm is constructive and computationally tractable. Our proof exploits a new estimate sequence and a novel bound on the nonlinear metric distortion, both of which may be of independent interest.
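To make the setting concrete, the sketch below transplants a two-sequence Nesterov-style update to a Riemannian manifold via exponential and logarithm maps, using the unit sphere and a Karcher-mean objective as a simple geodesically convex instance. The objective, the constants L and mu, and the momentum coupling are illustrative assumptions only; the paper's Ragd uses a different, curvature-aware coupling derived from its estimate sequence, so this is a sketch of the general idea rather than the algorithm analyzed in the paper.

```python
# Illustrative sketch only: Nesterov-style momentum on a manifold via exp/log maps.
# The sphere, the Karcher-mean objective, and the constants L, mu are assumptions
# chosen for this example; they are not the paper's Ragd update or its constants.
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along tangent v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def log_map(x, y):
    """Logarithm map on the unit sphere: tangent vector at x pointing toward y,
    with length equal to the geodesic distance d(x, y)."""
    u = y - np.dot(x, y) * x                     # project y onto the tangent space at x
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros_like(x)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return theta * (u / nu)

def karcher_grad(x, points):
    """Riemannian gradient of f(x) = (1/2n) * sum_i d(x, p_i)^2 on the sphere."""
    return -np.mean([log_map(x, p) for p in points], axis=0)

def accelerated_step(x, x_prev, points, L, mu):
    """One Nesterov-style step: extrapolate along the geodesic away from the
    previous iterate, then take a gradient step from the extrapolated point."""
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    y = exp_map(x, -beta * log_map(x, x_prev))   # momentum / extrapolation
    x_next = exp_map(y, -karcher_grad(y, points) / L)
    return x_next, x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = np.array([0.0, 0.0, 1.0])
    # a small cluster of points near the north pole, well inside the g-convexity radius
    points = []
    for _ in range(5):
        v = rng.normal(size=3)
        v -= np.dot(v, base) * base              # make v tangent to the sphere at base
        points.append(exp_map(base, 0.3 * v / np.linalg.norm(v)))
    x, x_prev = points[0], points[0]
    for _ in range(30):
        x, x_prev = accelerated_step(x, x_prev, points, L=1.0, mu=0.5)  # assumed constants
    print("gradient norm at final iterate:", np.linalg.norm(karcher_grad(x, points)))
```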

Cite this Paper


BibTeX
@InProceedings{pmlr-v75-zhang18a,
  title     = {An Estimate Sequence for Geodesically Convex Optimization},
  author    = {Zhang, Hongyi and Sra, Suvrit},
  booktitle = {Proceedings of the 31st Conference On Learning Theory},
  pages     = {1703--1723},
  year      = {2018},
  editor    = {Bubeck, Sébastien and Perchet, Vianney and Rigollet, Philippe},
  volume    = {75},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v75/zhang18a.pdf},
  url       = {https://proceedings.mlr.press/v75/zhang18a.html},
  abstract  = {We propose a Riemannian version of Nesterov’s Accelerated Gradient algorithm (\textsc{Ragd}), and show that for \emph{geodesically} smooth and strongly convex problems, within a neighborhood of the minimizer whose radius depends on the condition number as well as the sectional curvature of the manifold, \textsc{Ragd} converges to the minimizer with acceleration. Unlike the algorithm in (Liu et al., 2017) that requires the exact solution to a nonlinear equation which in turn may be intractable, our algorithm is constructive and computationally tractable. Our proof exploits a new estimate sequence and a novel bound on the nonlinear metric distortion, both ideas may be of independent interest.}
}
Endnote
%0 Conference Paper
%T An Estimate Sequence for Geodesically Convex Optimization
%A Hongyi Zhang
%A Suvrit Sra
%B Proceedings of the 31st Conference On Learning Theory
%C Proceedings of Machine Learning Research
%D 2018
%E Sébastien Bubeck
%E Vianney Perchet
%E Philippe Rigollet
%F pmlr-v75-zhang18a
%I PMLR
%P 1703--1723
%U https://proceedings.mlr.press/v75/zhang18a.html
%V 75
%X We propose a Riemannian version of Nesterov’s Accelerated Gradient algorithm (\textsc{Ragd}), and show that for \emph{geodesically} smooth and strongly convex problems, within a neighborhood of the minimizer whose radius depends on the condition number as well as the sectional curvature of the manifold, \textsc{Ragd} converges to the minimizer with acceleration. Unlike the algorithm in (Liu et al., 2017) that requires the exact solution to a nonlinear equation which in turn may be intractable, our algorithm is constructive and computationally tractable. Our proof exploits a new estimate sequence and a novel bound on the nonlinear metric distortion, both ideas may be of independent interest.
APA
Zhang, H. & Sra, S. (2018). An Estimate Sequence for Geodesically Convex Optimization. Proceedings of the 31st Conference On Learning Theory, in Proceedings of Machine Learning Research 75:1703-1723. Available from https://proceedings.mlr.press/v75/zhang18a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v75/zhang18a.pdf