A Unified Framework for Nonconvex Low-Rank plus Sparse Matrix Recovery

Xiao Zhang, Lingxiao Wang, Quanquan Gu
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1097-1107, 2018.

Abstract

We propose a unified framework, based on matrix factorization, for solving general low-rank plus sparse matrix recovery problems, which covers a broad family of objective functions satisfying the restricted strong convexity and smoothness conditions. Built on projected gradient descent and a double thresholding operator, our generic algorithm is guaranteed to converge to the unknown low-rank and sparse matrices at a locally linear rate, while matching the best-known robustness guarantee (i.e., tolerance for sparsity). At the core of our theory is a novel structural Lipschitz gradient condition for low-rank plus sparse matrices, which is essential for proving the linear convergence rate of our algorithm, and which we believe is of independent interest for proving fast rates for general superposition-structured models. We illustrate the application of our framework through two concrete examples: robust matrix sensing and robust PCA. Numerical experiments corroborate our theory.
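The algorithm itself is not reproduced on this page. As a rough illustration of the kind of factored gradient descent the abstract describes, here is a minimal Python sketch specialized to the robust PCA example: observe M = L* + S* with rank(L*) <= r and S* having at most k nonzeros, and minimize f(L + S) = 0.5 * ||L + S - M||_F^2 with L factored as U V^T. The global top-k hard thresholding (used here in place of the paper's double thresholding operator), the spectral initialization, the step-size heuristic, and the omission of the paper's projection step are all simplifying assumptions, not the authors' exact method.

import numpy as np

def hard_threshold(A, k):
    # Keep the k largest-magnitude entries of A, zero out the rest.
    # (A simple global top-k stand-in; the paper's double thresholding
    # operator is more refined than this.)
    out = np.zeros_like(A)
    if k > 0:
        idx = np.argpartition(np.abs(A).ravel(), -k)[-k:]
        out.ravel()[idx] = A.ravel()[idx]
    return out

def robust_pca_gd(M, r, k, eta=None, iters=200):
    # Factored gradient descent on f(L + S) = 0.5 * ||L + S - M||_F^2
    # with L = U V^T, alternating with a hard-thresholding update of S.
    S = hard_threshold(M, k)                     # initial sparse estimate
    U0, s0, V0t = np.linalg.svd(M - S, full_matrices=False)
    U = U0[:, :r] * np.sqrt(s0[:r])              # balanced spectral init
    V = V0t[:r, :].T * np.sqrt(s0[:r])
    if eta is None:
        eta = 0.5 / s0[0]                        # conservative step-size heuristic
    for _ in range(iters):
        S = hard_threshold(M - U @ V.T, k)       # best k-sparse fit to the residual
        R = U @ V.T + S - M                      # gradient of f at the current L + S
        U, V = U - eta * R @ V, V - eta * R.T @ U
    return U @ V.T, S

A quick synthetic check of the sketch, with all problem sizes chosen arbitrarily for illustration:

d1, d2, r, k = 80, 60, 3, 100
rng = np.random.default_rng(0)
L_star = rng.standard_normal((d1, r)) @ rng.standard_normal((r, d2))
S_star = 5 * hard_threshold(rng.standard_normal((d1, d2)), k)
L_hat, S_hat = robust_pca_gd(L_star + S_star, r, k)
print(np.linalg.norm(L_hat - L_star) / np.linalg.norm(L_star))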

Cite this Paper

BibTeX
@InProceedings{pmlr-v84-zhang18c,
  title     = {A Unified Framework for Nonconvex Low-Rank plus Sparse Matrix Recovery},
  author    = {Zhang, Xiao and Wang, Lingxiao and Gu, Quanquan},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1097--1107},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/zhang18c/zhang18c.pdf},
  url       = {https://proceedings.mlr.press/v84/zhang18c.html}
}
Endnote
%0 Conference Paper
%T A Unified Framework for Nonconvex Low-Rank plus Sparse Matrix Recovery
%A Xiao Zhang
%A Lingxiao Wang
%A Quanquan Gu
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-zhang18c
%I PMLR
%P 1097--1107
%U https://proceedings.mlr.press/v84/zhang18c.html
%V 84
APA
Zhang, X., Wang, L. & Gu, Q. (2018). A Unified Framework for Nonconvex Low-Rank plus Sparse Matrix Recovery. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1097-1107. Available from https://proceedings.mlr.press/v84/zhang18c.html.
