Nearly Optimal Robust Matrix Completion

Yeshwanth Cherapanamjeri, Kartik Gupta, Prateek Jain
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:797-805, 2017.

Abstract

In this paper, we consider the problem of Robust Matrix Completion (RMC), where the goal is to recover a low-rank matrix by observing a small number of its entries, out of which a few can be arbitrarily corrupted. We propose a simple projected gradient descent-based method to estimate the low-rank matrix that alternately performs a projected gradient descent step and cleans up a few of the corrupted entries using hard-thresholding. Our algorithm solves RMC using a nearly optimal number of observations while tolerating a nearly optimal number of corruptions. Our result also implies a significant improvement over the existing time complexity bounds for the low-rank matrix completion problem. Finally, an application of our result to the robust PCA problem (low-rank + sparse matrix separation) leads to a nearly linear-time (in matrix dimensions) algorithm for the same; existing state-of-the-art methods require quadratic time. Our empirical results corroborate our theoretical results and show that even for moderate-sized problems, our method for robust PCA is an order of magnitude faster than existing methods.
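The alternating scheme described above can be sketched in a few lines of numpy. This is an illustrative simplification, not the paper's exact algorithm: the rank-r projection is a truncated SVD, the step size 1/p (inverse sampling probability) follows standard projected-gradient matrix-completion practice, and the cleanup step here hard-thresholds by keeping the k largest-magnitude residuals, assuming the corruption count k is known; the function names (`rmc_pgd`, `hard_threshold_topk`) and all parameter choices are ours, not the paper's.

```python
import numpy as np

def hard_threshold_topk(R, k):
    """Keep only the k largest-magnitude entries of R, zero elsewhere."""
    S = np.zeros_like(R)
    if k > 0:
        idx = np.argpartition(np.abs(R).ravel(), -k)[-k:]
        S.ravel()[idx] = R.ravel()[idx]
    return S

def rmc_pgd(M_obs, mask, rank, k, iters=150):
    """Sketch of RMC via alternating projected gradient descent and
    hard-thresholding. M_obs holds the observed (possibly corrupted)
    entries; mask is 1 on observed positions, 0 elsewhere."""
    eta = 1.0 / mask.mean()          # step size ~ 1 / sampling probability
    L = np.zeros_like(M_obs)          # low-rank estimate
    S = np.zeros_like(M_obs)          # sparse-corruption estimate
    for _ in range(iters):
        # gradient of the squared loss on observed, un-corrupted residual
        R = mask * (M_obs - L - S)
        # projected gradient step: rank-r projection via truncated SVD
        U, sig, Vt = np.linalg.svd(L + eta * R, full_matrices=False)
        L = (U[:, :rank] * sig[:rank]) @ Vt[:rank]
        # cleanup step: absorb the k largest residuals into the sparse part
        S = hard_threshold_topk(mask * (M_obs - L), k)
    return L, S

# Toy demonstration: planted rank-2 matrix, 80% of entries observed,
# 5 observed entries corrupted by large spikes.
rng = np.random.default_rng(0)
n, r = 30, 2
L_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = (rng.random((n, n)) < 0.8).astype(float)
M = mask * L_true
obs = np.argwhere(mask > 0)
for i, j in obs[rng.choice(len(obs), 5, replace=False)]:
    M[i, j] += 10.0                   # arbitrary corruption

L_hat, S_hat = rmc_pgd(M, mask, rank=r, k=5)
err = np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true)
```

On this toy instance the recovered `L_hat` is close to the planted low-rank matrix and `S_hat` is supported on the corrupted entries; each iteration costs one truncated SVD plus work proportional to the number of observed entries, which is what drives the near-linear running time claimed above.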

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-cherapanamjeri17a,
  title = {Nearly Optimal Robust Matrix Completion},
  author = {Yeshwanth Cherapanamjeri and Kartik Gupta and Prateek Jain},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages = {797--805},
  year = {2017},
  editor = {Precup, Doina and Teh, Yee Whye},
  volume = {70},
  series = {Proceedings of Machine Learning Research},
  month = {06--11 Aug},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v70/cherapanamjeri17a/cherapanamjeri17a.pdf},
  url = {https://proceedings.mlr.press/v70/cherapanamjeri17a.html},
  abstract = {In this paper, we consider the problem of Robust Matrix Completion (RMC) where the goal is to recover a low-rank matrix by observing a small number of its entries out of which a few can be arbitrarily corrupted. We propose a simple projected gradient descent-based method to estimate the low-rank matrix that alternately performs a projected gradient descent step and cleans up a few of the corrupted entries using hard-thresholding. Our algorithm solves RMC using nearly optimal number of observations while tolerating a nearly optimal number of corruptions. Our result also implies significant improvement over the existing time complexity bounds for the low-rank matrix completion problem. Finally, an application of our result to the robust PCA problem (low-rank+sparse matrix separation) leads to nearly linear time (in matrix dimensions) algorithm for the same; existing state-of-the-art methods require quadratic time. Our empirical results corroborate our theoretical results and show that even for moderate sized problems, our method for robust PCA is an order of magnitude faster than the existing methods.}
}
Endnote
%0 Conference Paper
%T Nearly Optimal Robust Matrix Completion
%A Yeshwanth Cherapanamjeri
%A Kartik Gupta
%A Prateek Jain
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-cherapanamjeri17a
%I PMLR
%P 797--805
%U https://proceedings.mlr.press/v70/cherapanamjeri17a.html
%V 70
%X In this paper, we consider the problem of Robust Matrix Completion (RMC) where the goal is to recover a low-rank matrix by observing a small number of its entries out of which a few can be arbitrarily corrupted. We propose a simple projected gradient descent-based method to estimate the low-rank matrix that alternately performs a projected gradient descent step and cleans up a few of the corrupted entries using hard-thresholding. Our algorithm solves RMC using nearly optimal number of observations while tolerating a nearly optimal number of corruptions. Our result also implies significant improvement over the existing time complexity bounds for the low-rank matrix completion problem. Finally, an application of our result to the robust PCA problem (low-rank+sparse matrix separation) leads to nearly linear time (in matrix dimensions) algorithm for the same; existing state-of-the-art methods require quadratic time. Our empirical results corroborate our theoretical results and show that even for moderate sized problems, our method for robust PCA is an order of magnitude faster than the existing methods.
APA
Cherapanamjeri, Y., Gupta, K. & Jain, P. (2017). Nearly Optimal Robust Matrix Completion. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:797-805. Available from https://proceedings.mlr.press/v70/cherapanamjeri17a.html.