An Estimate Sequence for Geodesically Convex Optimization
Proceedings of the 31st Conference On Learning Theory, PMLR 75:1703-1723, 2018.
Abstract
We propose a Riemannian version of Nesterov’s Accelerated Gradient algorithm (\textsc{Ragd}), and show that for \emph{geodesically} smooth and strongly convex problems, within a neighborhood of the minimizer whose radius depends on the condition number as well as the sectional curvature of the manifold, \textsc{Ragd} converges to the minimizer with acceleration. Unlike the algorithm in (Liu et al., 2017), which requires the exact solution to a nonlinear equation that may be intractable, our algorithm is constructive and computationally tractable. Our proof exploits a new estimate sequence and a novel bound on the nonlinear metric distortion, both of which may be of independent interest.
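The abstract does not spell out the update rule, but a Nesterov-style iteration on a Riemannian manifold can be sketched with exponential and logarithm maps. The sketch below is illustrative only: it runs on the unit sphere with placeholder step sizes `h`, `alpha`, `beta` and a toy eigenvector objective, and it is not the \textsc{Ragd} parameter schedule or the estimate-sequence analysis from the paper.

```python
import numpy as np

def exp_map(x, v):
    """Sphere exponential map: move from x along a tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * v / nv

def log_map(x, y):
    """Sphere logarithm map: tangent vector at x pointing toward y.
    Assumes y is not at the cut locus (antipode) of x."""
    cos_t = np.clip(x @ y, -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - cos_t * x
    return theta * u / np.linalg.norm(u)

def riem_grad(x, egrad):
    """Project a Euclidean gradient onto the tangent space at x."""
    return egrad - (x @ egrad) * x

# Toy objective f(x) = x^T A x on the sphere (smallest-eigenvector problem);
# it is geodesically convex only near the minimizer, which loosely matches the
# local-neighborhood setting described in the abstract.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A + A.T
x = rng.standard_normal(5)
x /= np.linalg.norm(x)
v = x.copy()

h, alpha, beta = 0.05, 0.3, 0.8  # placeholder constants, not the paper's schedule
for _ in range(300):
    y = exp_map(x, alpha * log_map(x, v))         # couple iterate and momentum point
    g = riem_grad(y, 2.0 * A @ y)                 # Riemannian gradient at y
    x = exp_map(y, -h * g)                        # gradient step from y
    v = exp_map(y, beta * log_map(y, v) - h * g)  # momentum update from y

print("f(x) =", x @ A @ x, " vs. min eigenvalue =", np.linalg.eigvalsh(A)[0])
```

The three-step structure (a coupling point $y$, a gradient step producing $x$, and a momentum step producing $v$) mirrors the estimate-sequence view of Nesterov's method in the Euclidean case; in the Riemannian setting the coefficients additionally depend on curvature-induced metric distortion, which is the quantity the paper's new bound controls.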