Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:201-216, 2023.
Abstract
We provide a robust convergence analysis of the
Riemannian gradient descent algorithm for computing
the leading eigenvector of a real symmetric
matrix. Our result characterizes the convergence
behavior of the algorithm under noisy updates,
where the noise may be generated by a stochastic
process or chosen adversarially. Noisy Riemannian
gradient descent has a broad range of applications
in machine learning and statistics, e.g., streaming
principal component analysis and privacy-preserving
spectral analysis. In particular, we demonstrate
the usefulness of our convergence bound by deriving
a new eigengap-dependent sample complexity for the
inexact Riemannian stochastic recursive gradient
algorithm, which uses mini-batch gradients instead
of full gradients in its outer loops. Our robust
convergence paradigm strictly improves upon the
state-of-the-art sample complexity in terms of its
dependence on the eigengap.