Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition

Zeyuan Allen-Zhu, Yuanzhi Li
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:98-106, 2017.

Abstract

We study k-GenEV, the problem of finding the top k generalized eigenvectors, and k-CCA, the problem of finding the top k vectors in canonical-correlation analysis. We propose algorithms LazyEV and LazyCCA to solve the two problems with running times linearly dependent on the input size and on k. Furthermore, our algorithms are doubly accelerated: our running times depend only on the square root of the matrix condition number and on the square root of the eigengap. This is the first such result for both k-GenEV and k-CCA. We also provide the first gap-free results, which give running times that depend on $1/\sqrt{\varepsilon}$ rather than on the eigengap.
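For context, the two problems can be stated as follows (the notation here is ours, following standard conventions, and is not taken from the paper). In k-GenEV, one is given a symmetric matrix $A$ and a symmetric positive definite matrix $B$, and seeks the top $k$ generalized eigenvectors, i.e. vectors $v_1, \dots, v_k$ satisfying
$$A v_i = \lambda_i B v_i$$
for the $k$ largest generalized eigenvalues $\lambda_1 \ge \dots \ge \lambda_k$. In k-CCA, one is given data matrices $X$ and $Y$ and seeks the top $k$ pairs of canonical directions; the leading pair $(\phi, \psi)$ maximizes the correlation
$$\phi^\top S_{xy} \psi \quad \text{subject to} \quad \phi^\top S_{xx} \phi = \psi^\top S_{yy} \psi = 1,$$
where $S_{xx}$ and $S_{yy}$ are the empirical covariance matrices and $S_{xy}$ is the cross-covariance matrix. This constrained problem is equivalent to a generalized eigenvalue problem, which is why the two problems are naturally treated together.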

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-allen-zhu17b,
  title     = {Doubly Accelerated Methods for Faster {CCA} and Generalized Eigendecomposition},
  author    = {Zeyuan Allen-Zhu and Yuanzhi Li},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {98--106},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/allen-zhu17b/allen-zhu17b.pdf},
  url       = {https://proceedings.mlr.press/v70/allen-zhu17b.html},
  abstract  = {We study k-GenEV, the problem of finding the top k generalized eigenvectors, and k-CCA, the problem of finding the top k vectors in canonical-correlation analysis. We propose algorithms LazyEV and LazyCCA to solve the two problems with running times linearly dependent on the input size and on k. Furthermore, our algorithms are doubly-accelerated: our running times depend only on the square root of the matrix condition number, and on the square root of the eigengap. This is the first such result for both k-GenEV or k-CCA. We also provide the first gap-free results, which provide running times that depend on $1/\sqrt{\varepsilon}$ rather than the eigengap.}
}
EndNote
%0 Conference Paper
%T Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition
%A Zeyuan Allen-Zhu
%A Yuanzhi Li
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-allen-zhu17b
%I PMLR
%P 98--106
%U https://proceedings.mlr.press/v70/allen-zhu17b.html
%V 70
%X We study k-GenEV, the problem of finding the top k generalized eigenvectors, and k-CCA, the problem of finding the top k vectors in canonical-correlation analysis. We propose algorithms LazyEV and LazyCCA to solve the two problems with running times linearly dependent on the input size and on k. Furthermore, our algorithms are doubly-accelerated: our running times depend only on the square root of the matrix condition number, and on the square root of the eigengap. This is the first such result for both k-GenEV or k-CCA. We also provide the first gap-free results, which provide running times that depend on $1/\sqrt{\varepsilon}$ rather than the eigengap.
APA
Allen-Zhu, Z. & Li, Y. (2017). Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:98-106. Available from https://proceedings.mlr.press/v70/allen-zhu17b.html.
