Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
Proceedings of Machine Learning Research, PMLR 89:138-147, 2019.
Abstract
SPIDER (Stochastic Path Integrated Differential EstimatoR) is an efficient gradient estimation technique developed for non-convex stochastic optimization. Although SPIDER-type methods have been shown to attain nearly optimal computational complexity bounds, they are limited to linear metric spaces. In this paper, we introduce the Riemannian SPIDER (R-SPIDER) method as a novel nonlinear-metric extension of SPIDER for efficient non-convex optimization on Riemannian manifolds. We prove that for finite-sum problems with $n$ components, R-SPIDER converges to an $\epsilon$-accuracy stationary point within $\mathcal{O}\big(\min\big(n+\frac{\sqrt{n}}{\epsilon^2},\frac{1}{\epsilon^3}\big)\big)$ stochastic gradient evaluations, which is sharper in magnitude than the bounds of prior Riemannian first-order methods. For online optimization, R-SPIDER is shown to converge with $\mathcal{O}\big(\frac{1}{\epsilon^3}\big)$ complexity, which is, to the best of our knowledge, the first non-asymptotic result for online Riemannian optimization. In particular, for gradient-dominated functions, we further develop a variant of R-SPIDER and prove its linear convergence rate. Numerical results demonstrate the computational efficiency of the proposed methods.
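To make the idea concrete, the following is a minimal, hypothetical sketch (not the paper's implementation) of an R-SPIDER-style update for a finite-sum least-squares objective on the unit sphere. It uses tangent-space projection for the Riemannian gradient and for vector transport, normalization as the retraction, and the SPIDER recursion $v_t = \mathrm{grad} f_S(x_t) - \Gamma(\mathrm{grad} f_S(x_{t-1})) + \Gamma(v_{t-1})$ with a periodic full-gradient restart; all step sizes, batch sizes, and the restart period $q$ are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))          # n component functions f_i(x) = (a_i.x - b_i)^2
b = rng.normal(size=n)

def egrad(x, idx):
    """Euclidean gradient of the mini-batch average over indices idx."""
    r = A[idx] @ x - b[idx]
    return 2 * A[idx].T @ r / len(idx)

def rgrad(x, idx):
    """Riemannian gradient on the sphere: project onto the tangent space at x."""
    g = egrad(x, idx)
    return g - (x @ g) * x

def retract(x, v):
    """Retraction on the sphere: step in the ambient space, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def transport(y, v):
    """Vector transport by projection onto the tangent space at y."""
    return v - (y @ v) * y

x = rng.normal(size=d)
x /= np.linalg.norm(x)
g0 = np.linalg.norm(rgrad(x, np.arange(n)))   # initial gradient norm, for reference
eta, q, batch = 0.05, 10, 20                  # illustrative hyperparameters

for t in range(200):
    if t % q == 0:
        v = rgrad(x, np.arange(n))            # restart: full Riemannian gradient
    else:
        idx = rng.choice(n, batch, replace=False)
        # SPIDER recursion: correct the transported previous estimate with the
        # mini-batch gradient difference between x and the previous iterate.
        v = rgrad(x, idx) - transport(x, rgrad(x_prev, idx)) + transport(x, v)
    x_prev = x
    x = retract(x, -eta * v)
```

The per-iteration cost between restarts is two mini-batch gradients rather than a full pass over the $n$ components, which is the source of the $\sqrt{n}/\epsilon^2$ term in the finite-sum complexity bound.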