Riemannian Pursuit for Big Matrix Recovery

Mingkui Tan, Ivor W. Tsang, Li Wang, Bart Vandereycken, Sinno Jialin Pan
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1539-1547, 2014.

Abstract

Low rank matrix recovery is a fundamental task in many real-world applications. The performance of existing methods, however, deteriorates significantly when applied to ill-conditioned or large-scale matrices. In this paper, we therefore propose an efficient method, called Riemannian Pursuit (RP), that aims to address these two problems simultaneously. Our method consists of a sequence of fixed-rank optimization problems. Each subproblem, solved by a nonlinear Riemannian conjugate gradient method, aims to correct the solution in the most important subspace of increasing size. Theoretically, RP converges linearly under mild conditions and experimental results show that it substantially outperforms existing methods when applied to large-scale and ill-conditioned matrices.
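The outer loop described in the abstract (a sequence of fixed-rank subproblems of increasing rank, each warm-started from the previous estimate) can be illustrated for matrix completion with the minimal sketch below. This is not the authors' implementation: the helper fixed_rank_step substitutes a plain projected-gradient/truncated-SVD update for the nonlinear Riemannian conjugate gradient solver used in the paper, and all function names, step sizes, and stopping rules here are assumptions for illustration only.

import numpy as np

def fixed_rank_step(X, M, mask, rank, n_iters=50, step=1.0):
    # Surrogate for the fixed-rank subproblem: projected gradient with a truncated
    # SVD, standing in for the paper's nonlinear Riemannian CG on the rank-r manifold.
    for _ in range(n_iters):
        grad = mask * (X - M)                         # gradient of 0.5*||P_Omega(X - M)||_F^2
        U, s, Vt = np.linalg.svd(X - step * grad, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]      # project back to rank at most r
    return X

def riemannian_pursuit_sketch(M, mask, max_rank, tol=1e-4):
    # Outer loop from the abstract: solve fixed-rank problems of increasing rank,
    # warm-starting each from the previous solution, until the residual is small.
    X = np.zeros_like(M)
    obs_norm = max(np.linalg.norm(mask * M), 1e-12)
    for r in range(1, max_rank + 1):
        X = fixed_rank_step(X, M, mask, rank=r)
        if np.linalg.norm(mask * (X - M)) <= tol * obs_norm:
            break
    return X

# Toy usage: recover a random rank-3 matrix from roughly half of its entries.
rng = np.random.default_rng(0)
M_true = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 100))
mask = rng.random(M_true.shape) < 0.5
X_hat = riemannian_pursuit_sketch(M_true, mask, max_rank=10)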

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-tan14,
  title     = {Riemannian Pursuit for Big Matrix Recovery},
  author    = {Tan, Mingkui and Tsang, Ivor W. and Wang, Li and Vandereycken, Bart and Pan, Sinno Jialin},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1539--1547},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/tan14.pdf},
  url       = {https://proceedings.mlr.press/v32/tan14.html},
  abstract  = {Low rank matrix recovery is a fundamental task in many real-world applications. The performance of existing methods, however, deteriorates significantly when applied to ill-conditioned or large-scale matrices. In this paper, we therefore propose an efficient method, called Riemannian Pursuit (RP), that aims to address these two problems simultaneously. Our method consists of a sequence of fixed-rank optimization problems. Each subproblem, solved by a nonlinear Riemannian conjugate gradient method, aims to correct the solution in the most important subspace of increasing size. Theoretically, RP converges linearly under mild conditions and experimental results show that it substantially outperforms existing methods when applied to large-scale and ill-conditioned matrices.}
}
Endnote
%0 Conference Paper
%T Riemannian Pursuit for Big Matrix Recovery
%A Mingkui Tan
%A Ivor W. Tsang
%A Li Wang
%A Bart Vandereycken
%A Sinno Jialin Pan
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-tan14
%I PMLR
%P 1539--1547
%U https://proceedings.mlr.press/v32/tan14.html
%V 32
%N 2
%X Low rank matrix recovery is a fundamental task in many real-world applications. The performance of existing methods, however, deteriorates significantly when applied to ill-conditioned or large-scale matrices. In this paper, we therefore propose an efficient method, called Riemannian Pursuit (RP), that aims to address these two problems simultaneously. Our method consists of a sequence of fixed-rank optimization problems. Each subproblem, solved by a nonlinear Riemannian conjugate gradient method, aims to correct the solution in the most important subspace of increasing size. Theoretically, RP converges linearly under mild conditions and experimental results show that it substantially outperforms existing methods when applied to large-scale and ill-conditioned matrices.
RIS
TY - CPAPER
TI - Riemannian Pursuit for Big Matrix Recovery
AU - Mingkui Tan
AU - Ivor W. Tsang
AU - Li Wang
AU - Bart Vandereycken
AU - Sinno Jialin Pan
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-tan14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 1539
EP - 1547
L1 - http://proceedings.mlr.press/v32/tan14.pdf
UR - https://proceedings.mlr.press/v32/tan14.html
AB - Low rank matrix recovery is a fundamental task in many real-world applications. The performance of existing methods, however, deteriorates significantly when applied to ill-conditioned or large-scale matrices. In this paper, we therefore propose an efficient method, called Riemannian Pursuit (RP), that aims to address these two problems simultaneously. Our method consists of a sequence of fixed-rank optimization problems. Each subproblem, solved by a nonlinear Riemannian conjugate gradient method, aims to correct the solution in the most important subspace of increasing size. Theoretically, RP converges linearly under mild conditions and experimental results show that it substantially outperforms existing methods when applied to large-scale and ill-conditioned matrices.
ER -
APA
Tan, M., Tsang, I.W., Wang, L., Vandereycken, B. & Pan, S.J. (2014). Riemannian Pursuit for Big Matrix Recovery. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1539-1547. Available from https://proceedings.mlr.press/v32/tan14.html.