Riemannian Stochastic Recursive Gradient Algorithm
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2516-2524, 2018.
Abstract
Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions on a Riemannian manifold. The present paper proposes a Riemannian stochastic recursive gradient algorithm (R-SRG), which does not require the inverse of a retraction between two distant iterates on the manifold. Convergence analyses of R-SRG are performed on both retraction-convex and non-convex functions under computationally efficient retraction and vector transport operations. The key challenge is the analysis of the influence of vector transport along the retraction curve. Numerical evaluations reveal that R-SRG competes well with state-of-the-art Riemannian batch and stochastic gradient algorithms.
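To illustrate the idea behind a recursive stochastic gradient estimator on a manifold, the following is a minimal sketch on the unit sphere, where the retraction is normalization and vector transport is orthogonal projection onto the new tangent space. The least-squares losses, step size, and inner-loop length are illustrative assumptions, not the paper's actual algorithmic details or experimental setup.

```python
import numpy as np

def retract(x, v):
    """Retraction on the sphere: step in the tangent direction, renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def project(x, g):
    """Project an ambient vector g onto the tangent space at x."""
    return g - np.dot(x, g) * x

def transport(x_from, x_to, v):
    """Vector transport by projection onto the tangent space at x_to
    (a simple transport choice; the paper's setting is more general)."""
    return project(x_to, v)

def egrad(A, b, x, i):
    """Euclidean gradient of the illustrative loss f_i(x) = 0.5*(a_i^T x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

def recursive_grad_epoch(A, b, x0, step=0.05, inner=200, seed=0):
    """One outer loop of a recursive stochastic gradient scheme:
    compute the full Riemannian gradient at the snapshot x0, then update
    the estimator recursively along the iterates, transporting the
    previous estimate into the current tangent space."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    v = project(x0, sum(egrad(A, b, x0, i) for i in range(n)) / n)
    x_prev, x = x0, retract(x0, -step * v)
    for _ in range(inner):
        i = rng.integers(n)
        g_new = project(x, egrad(A, b, x, i))
        g_old = project(x_prev, egrad(A, b, x_prev, i))
        # Recursive estimator: stochastic gradient difference plus the
        # transported previous estimate (no inverse retraction needed).
        v = g_new - transport(x_prev, x, g_old) + transport(x_prev, x, v)
        x_prev, x = x, retract(x, -step * v)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = A @ np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # planted minimizer on the sphere
x0 = np.ones(5) / np.sqrt(5.0)
x = recursive_grad_epoch(A, b, x0)
```

Note that the update only ever transports vectors between consecutive iterates, which is what removes the need for an inverse retraction between distant points.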