Universal Matrix Completion

Srinadh Bhojanapalli, Prateek Jain
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1881-1889, 2014.

Abstract

The problem of low-rank matrix completion has recently generated a lot of interest, leading to several results that offer exact solutions to the problem. However, in order to do so, these methods make assumptions that can be quite restrictive in practice. More specifically, the methods assume that: a) the observed indices are sampled uniformly at random, and b) for every new matrix, the observed indices are sampled afresh. In this work, we address these issues by providing a universal recovery guarantee for matrix completion that works for a variety of sampling schemes. In particular, we show that if the set of sampled indices comes from the edges of a bipartite graph with a large spectral gap (i.e., a large gap between the first and second singular values), then the nuclear norm minimization based method exactly recovers all low-rank matrices that satisfy certain incoherence properties. Moreover, we also show that under certain stricter incoherence conditions, O(nr^2) uniformly sampled entries are enough to recover any rank-r n×n matrix, in contrast to the O(nr log n) sample complexity required by other matrix completion algorithms as well as by existing analyses of the nuclear norm method.
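To make the abstract's setup concrete, the following is a minimal sketch (not code from the paper, and not the authors' exact construction): it builds a roughly d-regular bipartite sampling pattern as a union of random permutation matrices, checks the spectral gap of its biadjacency matrix, and then runs nuclear norm minimization on the observed entries. The sizes n, r, d and the use of numpy/cvxpy are illustrative assumptions.

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, r, d = 60, 2, 20   # matrix size, target rank, samples per row/column (illustrative)

# Ground-truth rank-r matrix M = U V^T with Gaussian factors (incoherent w.h.p.).
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
M = U @ V.T

# Sampling pattern = union of d random permutation matrices, i.e. the biadjacency
# matrix of a (roughly) d-regular bipartite observation graph.
A = np.zeros((n, n))
for _ in range(d):
    perm = rng.permutation(n)
    A[np.arange(n), perm] += 1.0
mask = (A > 0).astype(float)

# "Large spectral gap" check: sigma_1(A) equals d here, and sigma_2(A) should be
# noticeably smaller for a good (expander-like) sampling graph.
sigma = np.linalg.svd(A, compute_uv=False)
print(f"sigma_1 = {sigma[0]:.2f}, sigma_2 = {sigma[1]:.2f}")

# Nuclear norm minimization: minimize ||X||_* subject to X matching M on observed entries.
X = cp.Variable((n, n))
constraints = [cp.multiply(mask, X) == mask * M]
prob = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
prob.solve()

rel_err = np.linalg.norm(X.value - M, "fro") / np.linalg.norm(M, "fro")
print(f"relative recovery error: {rel_err:.2e}")

With this pattern the observed fraction is roughly d/n; exact recovery in the paper's sense additionally requires the incoherence conditions stated there.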

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-bhojanapalli14,
  title     = {Universal Matrix Completion},
  author    = {Bhojanapalli, Srinadh and Jain, Prateek},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1881--1889},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Bejing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/bhojanapalli14.pdf},
  url       = {https://proceedings.mlr.press/v32/bhojanapalli14.html},
  abstract  = {The problem of low-rank matrix completion has recently generated a lot of interest leading to several results that offer exact solutions to the problem. However, in order to do so, these methods make assumptions that can be quite restrictive in practice. More specifically, the methods assume that: a) the observed indices are sampled uniformly at random, and b) for every new matrix, the observed indices are sampled \emph{afresh}. In this work, we address these issues by providing a universal recovery guarantee for matrix completion that works for a variety of sampling schemes. In particular, we show that if the set of sampled indices come from the edges of a bipartite graph with large spectral gap (i.e. gap between the first and the second singular value), then the nuclear norm minimization based method exactly recovers all low-rank matrices that satisfy certain incoherence properties. Moreover, we also show that under certain stricter incoherence conditions, O(nr^2) uniformly sampled entries are enough to recover any rank-r n\times n matrix, in contrast to the O(nr\log n) sample complexity required by other matrix completion algorithms as well as existing analyses of the nuclear norm method.}
}
Endnote
%0 Conference Paper
%T Universal Matrix Completion
%A Srinadh Bhojanapalli
%A Prateek Jain
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-bhojanapalli14
%I PMLR
%P 1881--1889
%U https://proceedings.mlr.press/v32/bhojanapalli14.html
%V 32
%N 2
%X The problem of low-rank matrix completion has recently generated a lot of interest leading to several results that offer exact solutions to the problem. However, in order to do so, these methods make assumptions that can be quite restrictive in practice. More specifically, the methods assume that: a) the observed indices are sampled uniformly at random, and b) for every new matrix, the observed indices are sampled afresh. In this work, we address these issues by providing a universal recovery guarantee for matrix completion that works for a variety of sampling schemes. In particular, we show that if the set of sampled indices come from the edges of a bipartite graph with large spectral gap (i.e. gap between the first and the second singular value), then the nuclear norm minimization based method exactly recovers all low-rank matrices that satisfy certain incoherence properties. Moreover, we also show that under certain stricter incoherence conditions, O(nr^2) uniformly sampled entries are enough to recover any rank-r n×n matrix, in contrast to the O(nr log n) sample complexity required by other matrix completion algorithms as well as existing analyses of the nuclear norm method.
RIS
TY - CPAPER
TI - Universal Matrix Completion
AU - Srinadh Bhojanapalli
AU - Prateek Jain
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-bhojanapalli14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 1881
EP - 1889
L1 - http://proceedings.mlr.press/v32/bhojanapalli14.pdf
UR - https://proceedings.mlr.press/v32/bhojanapalli14.html
AB - The problem of low-rank matrix completion has recently generated a lot of interest leading to several results that offer exact solutions to the problem. However, in order to do so, these methods make assumptions that can be quite restrictive in practice. More specifically, the methods assume that: a) the observed indices are sampled uniformly at random, and b) for every new matrix, the observed indices are sampled afresh. In this work, we address these issues by providing a universal recovery guarantee for matrix completion that works for a variety of sampling schemes. In particular, we show that if the set of sampled indices come from the edges of a bipartite graph with large spectral gap (i.e. gap between the first and the second singular value), then the nuclear norm minimization based method exactly recovers all low-rank matrices that satisfy certain incoherence properties. Moreover, we also show that under certain stricter incoherence conditions, O(nr^2) uniformly sampled entries are enough to recover any rank-r n×n matrix, in contrast to the O(nr log n) sample complexity required by other matrix completion algorithms as well as existing analyses of the nuclear norm method.
ER -
APA
Bhojanapalli, S. & Jain, P. (2014). Universal Matrix Completion. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1881-1889. Available from https://proceedings.mlr.press/v32/bhojanapalli14.html.
