How Close Are the Eigenvectors of the Sample and Actual Covariance Matrices?
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2228-2237, 2017.
Abstract
How many samples are sufficient to guarantee that the eigenvectors of the sample covariance matrix are close to those of the actual covariance matrix? For a wide family of distributions, including distributions with finite second moment and sub-gaussian distributions supported in a centered Euclidean ball, we prove that the inner product between eigenvectors of the sample and actual covariance matrices decreases proportionally to the respective eigenvalue distance and the number of samples. Our findings imply non-asymptotic concentration bounds for eigenvectors and eigenvalues and carry strong consequences for the non-asymptotic analysis of PCA and its applications. For instance, they provide conditions for separating components estimated from $O(1)$ samples and show that even a few samples can be sufficient to perform dimensionality reduction, especially for low-rank covariances.
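As a quick numerical illustration of the claim (not code from the paper), the sketch below draws Gaussian samples from a known diagonal covariance and checks how the inner products between the top sample eigenvector and the non-matching actual eigenvectors shrink as the number of samples and the eigenvalue gaps grow; the dimension, eigenvalue profile, and sample sizes are arbitrary choices made for the demonstration.

```python
import numpy as np

# Illustrative experiment, not from the paper: compare eigenvectors of the
# sample covariance with those of a known actual covariance as n grows.
rng = np.random.default_rng(0)

d = 20
# Actual covariance: diagonal with decaying eigenvalues 1, 1/2, 1/3, ...
# so the eigenvalue gaps |lambda_1 - lambda_j| are known exactly.
eigvals = 1.0 / np.arange(1, d + 1)
Sigma = np.diag(eigvals)
V = np.eye(d)  # actual eigenvectors (columns), matching eigvals in order

for n in [50, 500, 5000]:
    X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
    S = X.T @ X / n               # sample covariance (mean known to be zero)
    w, U = np.linalg.eigh(S)      # eigh returns ascending eigenvalues
    U = U[:, ::-1]                # reorder to descending, matching Sigma
    # Inner products between the top sample eigenvector u_1 and the other
    # actual eigenvectors v_j; per the paper's claim they should shrink
    # both with n and with the eigenvalue distance |lambda_1 - lambda_j|.
    overlaps = np.abs(V.T @ U[:, 0])
    print(f"n={n:5d}  |<v_j, u_1>| for j=2..4:", np.round(overlaps[1:4], 3))
```

Running the sketch, the off-component overlaps decrease as n grows, and for fixed n they are smaller for eigenvectors whose eigenvalues lie farther from the top one, consistent with the proportionality stated in the abstract.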