Faster Principal Component Regression and Stable Matrix Chebyshev Approximation
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:107-115, 2017.
Abstract
We solve principal component regression (PCR), up to a multiplicative accuracy $1+\gamma$, by reducing the problem to $\tilde{O}(\gamma^{-1})$ black-box calls of ridge regression. Therefore, our algorithm does not require any explicit construction of the top principal components, and is suitable for large-scale PCR instances. In contrast, the previous best result required $\tilde{O}(\gamma^{-2})$ such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials, and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially when designing iterative methods.
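For context, a minimal sketch of the baseline that the paper's reduction avoids: explicit PCR, which computes the top principal components of $A$ (those with eigenvalue above a threshold $\lambda$), projects onto their span, and solves least squares there. All names and the thresholding convention below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def pcr_explicit(A, b, lam):
    """Baseline PCR: project onto the principal subspace of A
    (singular vectors with s**2 > lam), then regress there.
    Illustrative sketch only; the paper replaces this explicit
    eigendecomposition with black-box ridge-regression calls."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s**2 > lam                      # top principal components
    P = Vt[keep].T                         # basis of the principal subspace
    # least squares restricted to the principal subspace
    x_small, *_ = np.linalg.lstsq(A @ P, b, rcond=None)
    return P @ x_small

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
b = rng.standard_normal(100)
x = pcr_explicit(A, b, lam=0.5)
print(x.shape)
```

The SVD step is exactly the expensive, explicit construction of principal components that becomes prohibitive at scale; the paper's contribution is obtaining the same projection effect via $\tilde{O}(\gamma^{-1})$ ridge-regression solves and a stable matrix Chebyshev approximation to the sign function.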