Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm

You-Lin Chen, Zhiqiang Xu, Ping Li
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:201-216, 2023.

Abstract

We provide a robust convergence analysis of the Riemannian gradient descent algorithm for computing the leading eigenvector of a real symmetric matrix. Our result characterizes the convergence behavior of the algorithm under noisy updates, where the noise may be generated by a stochastic process or chosen adversarially. Noisy Riemannian gradient descent has a broad range of applications in machine learning and statistics, e.g., streaming principal component analysis and privacy-preserving spectral analysis. In particular, we demonstrate the usefulness of our convergence bound with a new eigengap-dependent sample complexity of the inexact Riemannian stochastic recursive gradient algorithm, which uses mini-batch gradients instead of full gradients in its outer loops. Our robust convergence paradigm strictly improves the state-of-the-art sample complexity in terms of the gap dependence.
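The algorithm analyzed in the abstract can be illustrated with a short sketch: Riemannian gradient ascent on the unit sphere for the objective f(x) = xᵀAx, whose maximizer is the leading eigenvector, with an additive noise term modeling the stochastic or adversarial perturbations. This is a minimal illustration only; the function name, step size, and noise model below are assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def noisy_rgd_top_eigvec(A, eta=0.1, iters=500, noise_std=0.0, seed=0):
    """Sketch of noisy Riemannian gradient descent on the unit sphere
    for the leading eigenvector of a symmetric matrix A.

    Maximizing x^T A x is equivalent to gradient descent on -x^T A x;
    all names and defaults here are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        # Riemannian gradient of x^T A x on the sphere: project the
        # Euclidean gradient onto the tangent space at x.
        g = A @ x - (x @ A @ x) * x
        # Noisy step (noise_std models stochastic/adversarial perturbation),
        # then retract back onto the sphere by renormalization.
        x = x + eta * g + noise_std * rng.standard_normal(n)
        x /= np.linalg.norm(x)
    return x
```

With zero noise this reduces to plain Riemannian gradient descent, and on a matrix with a positive eigengap the iterate aligns with the top eigenvector; a small `noise_std` perturbs each update, which is the regime the paper's robust analysis covers.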

Cite this Paper

BibTeX
@InProceedings{pmlr-v189-chen23c,
  title     = {Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm},
  author    = {Chen, You-Lin and Xu, Zhiqiang and Li, Ping},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  pages     = {201--216},
  year      = {2023},
  editor    = {Khan, Emtiyaz and Gonen, Mehmet},
  volume    = {189},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v189/chen23c/chen23c.pdf},
  url       = {https://proceedings.mlr.press/v189/chen23c.html},
  abstract  = {We provide a robust convergence analysis of the Riemannian gradient descent algorithm for computing the leading eigenvector of a real symmetric matrix. Our result characterizes the convergence behavior of the algorithm under noisy updates, where the noise may be generated by a stochastic process or chosen adversarially. Noisy Riemannian gradient descent has a broad range of applications in machine learning and statistics, e.g., streaming principal component analysis and privacy-preserving spectral analysis. In particular, we demonstrate the usefulness of our convergence bound with a new eigengap-dependent sample complexity of the inexact Riemannian stochastic recursive gradient algorithm, which uses mini-batch gradients instead of full gradients in its outer loops. Our robust convergence paradigm strictly improves the state-of-the-art sample complexity in terms of the gap dependence.}
}
Endnote
%0 Conference Paper
%T Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm
%A You-Lin Chen
%A Zhiqiang Xu
%A Ping Li
%B Proceedings of The 14th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Emtiyaz Khan
%E Mehmet Gonen
%F pmlr-v189-chen23c
%I PMLR
%P 201--216
%U https://proceedings.mlr.press/v189/chen23c.html
%V 189
%X We provide a robust convergence analysis of the Riemannian gradient descent algorithm for computing the leading eigenvector of a real symmetric matrix. Our result characterizes the convergence behavior of the algorithm under noisy updates, where the noise may be generated by a stochastic process or chosen adversarially. Noisy Riemannian gradient descent has a broad range of applications in machine learning and statistics, e.g., streaming principal component analysis and privacy-preserving spectral analysis. In particular, we demonstrate the usefulness of our convergence bound with a new eigengap-dependent sample complexity of the inexact Riemannian stochastic recursive gradient algorithm, which uses mini-batch gradients instead of full gradients in its outer loops. Our robust convergence paradigm strictly improves the state-of-the-art sample complexity in terms of the gap dependence.
APA
Chen, Y., Xu, Z. & Li, P. (2023). Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm. Proceedings of The 14th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 189:201-216. Available from https://proceedings.mlr.press/v189/chen23c.html.