Faster Principal Component Regression and Stable Matrix Chebyshev Approximation

Zeyuan Allen-Zhu, Yuanzhi Li
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:107-115, 2017.

Abstract

We solve principal component regression (PCR), up to a multiplicative accuracy $1+\gamma$, by reducing the problem to $\tilde{O}(\gamma^{-1})$ black-box calls to ridge regression. Therefore, our algorithm does not require any explicit construction of the top principal components, and is suitable for large-scale PCR instances. In contrast, the previous result requires $\tilde{O}(\gamma^{-2})$ such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials, and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially when designing iterative methods.
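To make the two ingredients above concrete, below is a minimal Python sketch, with loud caveats: it uses the classical three-term Chebyshev recurrence and a crude interpolant of the sign function, whereas the paper's actual contributions are a stabilized recurrence and a degree-optimal sign approximation, neither of which is reproduced here. The matrix A, threshold lam, and degree 41 are hypothetical illustration choices; the point is that every matrix product costs exactly one black-box ridge-regression solve.

import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

def cheb_apply(matvec, v, coeffs):
    # Evaluate p(M) v for p = sum_k coeffs[k] * T_k via the classical
    # three-term recurrence T_{k+1}(M)v = 2 M (T_k(M)v) - T_{k-1}(M)v,
    # using only black-box products x -> M x (matrix-free).
    t_prev, t_curr = v, matvec(v)                # T_0(M)v and T_1(M)v
    out = coeffs[0] * t_prev
    if len(coeffs) > 1:
        out = out + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2.0 * matvec(t_curr) - t_prev
        out = out + c * t_curr
    return out

# Hypothetical instance: project onto the principal components of A
# whose eigenvalues (of A^T A) exceed the threshold lam.
rng = np.random.default_rng(0)
A, lam = rng.standard_normal((200, 50)), 5.0

def shifted_ridge(x):
    # One ridge-regression call x -> (A^T A + lam I)^{-1} A^T A x,
    # followed by the affine shift 2(.) - I. This maps an eigenvalue
    # sigma of A^T A to (sigma - lam) / (sigma + lam) in (-1, 1),
    # which is positive exactly when sigma > lam. The direct solve
    # here stands in for any black-box ridge solver.
    y = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ (A @ x))
    return 2.0 * y - x

# Crude Chebyshev interpolant of sign on [-1, 1]; the paper instead
# constructs a degree-optimal polynomial approximation.
p = Chebyshev.interpolate(np.sign, deg=41)
v = rng.standard_normal(50)
proj_v = 0.5 * (v + cheb_apply(shifted_ridge, v, p.coef))   # ~ P_lam v

Since the sign of the shifted operator equals $2P_\lambda - I$, halving $v + \mathrm{sign}(\cdot)\,v$ approximates the projection $P_\lambda v$. Naive interpolation of sign degrades near $\sigma \approx \lambda$, which is exactly the regime the paper's degree-optimal construction and stability analysis address.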

Cite this Paper

BibTeX
@InProceedings{pmlr-v70-allen-zhu17c,
  title     = {Faster Principal Component Regression and Stable Matrix {C}hebyshev Approximation},
  author    = {Zeyuan Allen-Zhu and Yuanzhi Li},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {107--115},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/allen-zhu17c/allen-zhu17c.pdf},
  url       = {https://proceedings.mlr.press/v70/allen-zhu17c.html},
}
APA
Allen-Zhu, Z. & Li, Y. (2017). Faster Principal Component Regression and Stable Matrix Chebyshev Approximation. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:107-115. Available from https://proceedings.mlr.press/v70/allen-zhu17c.html.
