On the Accelerated Noise-Tolerant Power Method

Zhiqiang Xu
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:7147-7175, 2023.

Abstract

We revisit the acceleration of the noise-tolerant power method, for which, despite previous studies, existing results remain unsatisfactory: they are either incorrect or suboptimal, and they lack generality. In this work, we present a simple yet general and optimal analysis via noise-corrupted Chebyshev polynomials, which allows a larger iteration rank $p$ than the target rank $k$, requires milder noise conditions stated in a new form, and achieves the optimal iteration complexity $\Theta\left(\sqrt{\frac{\lambda_{k}}{\lambda_{k}-\lambda_{q+1}}}\log\frac{1}{\epsilon}\right)$ for some $q$ satisfying $k\leq q\leq p$ in a certain regime of the momentum parameter. Interestingly, the analysis reveals a dynamic dependence of the noise tolerance on the spectral gap, i.e., linear at the beginning and square-root near convergence, while remaining commensurate with previous results in terms of overall tolerance. We relate our new form of noise-norm conditions to the existing trigonometric one, which enables an improved analysis of generalized eigenspace computation and canonical correlation analysis. We conduct an extensive experimental study to showcase the strong performance of the considered algorithm with a larger iteration rank $p>k$ across different applications.
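
The algorithm under study is the momentum-accelerated (Chebyshev-type) block power iteration, run with an iteration rank $p$ that may exceed the target rank $k$. As a rough illustration only, and not the paper's exact noise-tolerant procedure, the NumPy sketch below shows the noiseless three-term recurrence that the noise-corrupted Chebyshev analysis builds on; the function name, the fixed momentum parameter beta, and the per-step QR rescaling are assumptions made here for concreteness.

import numpy as np

def momentum_block_power_sketch(A, p, beta, num_iters, seed=0):
    """Illustrative momentum (Chebyshev-type) block power iteration (sketch).

    A         : symmetric matrix whose top eigenspace is sought
    p         : iteration rank (may exceed the target rank k)
    beta      : momentum parameter (fixed here for simplicity)
    num_iters : number of iterations
    Returns an orthonormal basis whose leading columns approximate the
    top eigenvectors of A.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # random orthonormal start
    X_prev = np.zeros_like(X)
    for _ in range(num_iters):
        # three-term recurrence underlying Chebyshev acceleration
        Y = A @ X - beta * X_prev
        Q, R = np.linalg.qr(Y)            # re-orthonormalize for stability
        # rescale the previous iterate by the same right factor so the
        # recurrence, and hence the iterated subspace, stays consistent
        X_prev = X @ np.linalg.inv(R)
        X = Q
    return X

# usage (hypothetical): basis = momentum_block_power_sketch(A, p=10, beta=0.25, num_iters=200)

In the standard noiseless momentum power method, beta is typically chosen near $\lambda_{q+1}^{2}/4$, which is what yields the square-root dependence on the relative spectral gap in the iteration complexity above; the paper's contribution is to quantify how much per-iteration noise such an update can tolerate.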

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-xu23g,
  title     = {On the Accelerated Noise-Tolerant Power Method},
  author    = {Xu, Zhiqiang},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {7147--7175},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/xu23g/xu23g.pdf},
  url       = {https://proceedings.mlr.press/v206/xu23g.html}
}
Endnote
%0 Conference Paper
%T On the Accelerated Noise-Tolerant Power Method
%A Zhiqiang Xu
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-xu23g
%I PMLR
%P 7147--7175
%U https://proceedings.mlr.press/v206/xu23g.html
%V 206
APA
Xu, Z. (2023). On the Accelerated Noise-Tolerant Power Method. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:7147-7175. Available from https://proceedings.mlr.press/v206/xu23g.html.