On the Accelerated Noise-Tolerant Power Method
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:7147-7175, 2023.
Abstract
We revisit the acceleration of the noise-tolerant power method, for which, despite prior studies, existing results remain unsatisfactory: they are either incorrect or suboptimal, and they lack generality. In this work, we present a simple yet general and optimal analysis via noise-corrupted Chebyshev polynomials, which allows a larger iteration rank $p$ than the target rank $k$, imposes weaker noise conditions in a new form, and achieves the optimal iteration complexity $\Theta\left(\sqrt{\frac{\lambda_{k}-\lambda_{q+1}}{\lambda_{k}}}\log\frac{1}{\epsilon}\right)$ for some $q$ satisfying $k\leq q\leq p$ in a certain regime of the momentum parameter. Interestingly, the analysis reveals a dynamic dependence of the noise tolerance on the spectral gap, i.e., linear at the beginning and square-root near convergence, while remaining commensurate with previous results in overall tolerance. We relate our new form of noise-norm conditions to the existing trigonometric one, which enables an improved analysis of generalized eigenspace computation and canonical correlation analysis. We conduct an extensive experimental study to demonstrate the strong performance of the considered algorithm with a larger iteration rank $p>k$ across different applications.
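To make the setting concrete, the following is a minimal NumPy sketch of a momentum-accelerated block power iteration with iteration rank $p$, the family of methods the abstract refers to. The function name, parameter names, and the choice to renormalize by a shared scalar (so the three-term Chebyshev-style recurrence is preserved) are illustrative assumptions, not the authors' exact pseudocode, and no noise model is included.

```python
import numpy as np

def accelerated_power_method(A, p, beta, iters, seed=0):
    """Sketch of momentum (Chebyshev-accelerated) power iteration.

    Approximates an orthonormal basis of a top eigenspace of the
    symmetric matrix A, using iteration rank p and momentum beta.
    Parameter names are hypothetical; this is not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))
    X_prev = np.zeros_like(X)
    for _ in range(iters):
        # Three-term recurrence: Y = A X_t - beta * X_{t-1}.
        # Iterating it applies a (scaled) Chebyshev-like polynomial of A.
        Y = A @ X - beta * X_prev
        # Rescale both iterates by the SAME scalar so the recurrence
        # is preserved up to a global scale (avoids over/underflow).
        scale = np.linalg.norm(Y)
        X_prev = X / scale
        X = Y / scale
    # Orthonormalize once at the end to return a basis.
    Q, _ = np.linalg.qr(X)
    return Q
```

With `beta = (lambda_{q+1} / 2)**2` (a standard choice in the momentum power-method literature), the recurrence damps the tail spectrum and yields the square-root dependence on the gap that appears in the complexity bound above; setting `beta = 0` recovers plain block power iteration.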