Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization

Zhiqiang Xu, Peilin Zhao, Jianneng Cao, Xiaoli Li
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1660-1669, 2016.

Abstract

Matrix eigen-decomposition is a classic and long-standing problem that plays a fundamental role in scientific computing and machine learning. Although several algorithms exist for this inherently non-convex problem, they remain inadequate for the needs of today's large-scale data. To address this gap, we propose a Doubly Stochastic Riemannian Gradient EIGenSolver, DSRG-EIGS, whose double stochasticity comes from generalizing stochastic Euclidean gradient ascent and stochastic Euclidean coordinate ascent to Riemannian manifolds. This greatly reduces the per-iteration complexity and lets the algorithm avoid matrix inversion entirely, making it well suited to large-scale applications. We analyze its convergence properties theoretically and validate it empirically on real-world datasets. Encouraging experimental results demonstrate its advantages over its deterministic counterparts.
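To give a concrete sense of the mechanism the abstract describes, the following is a minimal toy sketch, not the authors' DSRG-EIGS. It runs doubly stochastic Riemannian gradient ascent on the unit sphere to recover the top eigenvector of a diagonal matrix: one rank-one term of the matrix is sampled per step (stochastic gradient), only a random subset of coordinates is updated (stochastic coordinates), and renormalization serves as the retraction, so no matrix inversion is needed. All function names, parameters, and the diagonal test matrix are illustrative assumptions.

```python
import numpy as np

def dsrg_top_eigvec(eigvals, n_iters=3000, lr=0.01, n_coords=3, seed=0):
    """Toy doubly stochastic Riemannian gradient ascent on the sphere
    for the top eigenvector of A = diag(eigvals) (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    d = len(eigvals)
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)                       # start on the unit sphere
    for _ in range(n_iters):
        # Stochasticity 1: sample one rank-one term lam_k e_k e_k^T of A;
        # d * lam_k * v_k * e_k is an unbiased estimate of A v.
        k = rng.integers(d)
        g = d * eigvals[k] * v[k] * np.eye(d)[k]
        # Riemannian gradient: project onto the tangent space at v.
        rg = g - (v @ g) * v
        # Stochasticity 2: update only a random subset of coordinates.
        coords = rng.choice(d, size=n_coords, replace=False)
        mask = np.zeros(d)
        mask[coords] = 1.0
        v = v + lr * mask * rg
        v /= np.linalg.norm(v)                   # retraction back onto the sphere
    return v

# With a clear spectral gap, the iterate aligns with the top eigendirection e_0.
v = dsrg_top_eigvec(np.array([10.0, 1.0, 1.0, 1.0, 1.0]))
```

Note that every step touches only a single rank-one term and a few coordinates, which is the source of the reduced per-iteration cost the abstract refers to; the renormalization retraction is what removes any need for matrix inversion.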

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-xub16,
  title     = {Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization},
  author    = {Xu, Zhiqiang and Zhao, Peilin and Cao, Jianneng and Li, Xiaoli},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1660--1669},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/xub16.pdf},
  url       = {http://proceedings.mlr.press/v48/xub16.html},
  abstract  = {Matrix eigen-decomposition is a classic and long-standing problem that plays a fundamental role in scientific computing and machine learning. Despite some existing algorithms for this inherently non-convex problem, the study remains inadequate for the need of large data nowadays. To address this gap, we propose a Doubly Stochastic Riemannian Gradient EIGenSolver, DSRG-EIGS, where the double stochasticity comes from the generalization of the stochastic Euclidean gradient ascent and the stochastic Euclidean coordinate ascent to Riemannian manifolds. As a result, it induces a greatly reduced complexity per iteration, enables the algorithm to completely avoid the matrix inversion, and consequently makes it well-suited to large-scale applications. We theoretically analyze its convergence properties and empirically validate it on real-world datasets. Encouraging experimental results demonstrate its advantages over the deterministic counterparts.}
}
EndNote
%0 Conference Paper
%T Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization
%A Zhiqiang Xu
%A Peilin Zhao
%A Jianneng Cao
%A Xiaoli Li
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-xub16
%I PMLR
%P 1660--1669
%U http://proceedings.mlr.press/v48/xub16.html
%V 48
%X Matrix eigen-decomposition is a classic and long-standing problem that plays a fundamental role in scientific computing and machine learning. Despite some existing algorithms for this inherently non-convex problem, the study remains inadequate for the need of large data nowadays. To address this gap, we propose a Doubly Stochastic Riemannian Gradient EIGenSolver, DSRG-EIGS, where the double stochasticity comes from the generalization of the stochastic Euclidean gradient ascent and the stochastic Euclidean coordinate ascent to Riemannian manifolds. As a result, it induces a greatly reduced complexity per iteration, enables the algorithm to completely avoid the matrix inversion, and consequently makes it well-suited to large-scale applications. We theoretically analyze its convergence properties and empirically validate it on real-world datasets. Encouraging experimental results demonstrate its advantages over the deterministic counterparts.
RIS
TY - CPAPER
TI - Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization
AU - Zhiqiang Xu
AU - Peilin Zhao
AU - Jianneng Cao
AU - Xiaoli Li
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-xub16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 1660
EP - 1669
L1 - http://proceedings.mlr.press/v48/xub16.pdf
UR - http://proceedings.mlr.press/v48/xub16.html
AB - Matrix eigen-decomposition is a classic and long-standing problem that plays a fundamental role in scientific computing and machine learning. Despite some existing algorithms for this inherently non-convex problem, the study remains inadequate for the need of large data nowadays. To address this gap, we propose a Doubly Stochastic Riemannian Gradient EIGenSolver, DSRG-EIGS, where the double stochasticity comes from the generalization of the stochastic Euclidean gradient ascent and the stochastic Euclidean coordinate ascent to Riemannian manifolds. As a result, it induces a greatly reduced complexity per iteration, enables the algorithm to completely avoid the matrix inversion, and consequently makes it well-suited to large-scale applications. We theoretically analyze its convergence properties and empirically validate it on real-world datasets. Encouraging experimental results demonstrate its advantages over the deterministic counterparts.
ER -
APA
Xu, Z., Zhao, P., Cao, J. & Li, X. (2016). Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1660-1669. Available from http://proceedings.mlr.press/v48/xub16.html.

Related Material