Robust Matrix Completion from Quantized Observations

Jie Shen, Pranjal Awasthi, Ping Li
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:397-407, 2019.

Abstract

1-bit matrix completion refers to the problem of recovering a real-valued low-rank matrix from a small fraction of its sign patterns. In many real-world applications, however, the observations are not only highly quantized, but also grossly corrupted. In this work, we consider the noisy statistical model where each observed entry can be flipped with some probability after quantization. We propose a simple maximum likelihood estimator which is shown to be robust to the sign-flipping noise. In particular, we prove an upper bound on the statistical error, showing that with overwhelming probability $n = O\big(\mathrm{poly}(1-2\mathbb{E}[\tau])^{-2}\, rd \log d\big)$ samples are sufficient for accurate recovery, where $r$ and $d$ are the rank and dimension of the underlying matrix, respectively, and $\tau \in [0, 1/2)$ is a random variable that parameterizes the sign-flipping noise. Furthermore, a lower bound is established showing that the obtained sample complexity is near-optimal for prevalent statistical models. Finally, we substantiate our theoretical findings with a comprehensive study on synthetic and realistic data sets, and demonstrate state-of-the-art performance.
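
For concreteness, the observation model and a likelihood objective can be sketched as follows. This is a minimal illustration, not the paper's exact construction: it assumes a logistic link $f(x) = 1/(1+e^{-x})$ for the quantization step, uniform sampling of entries with replacement, and a plain 1-bit log-likelihood that ignores the flip probability; the estimator and noise-aware likelihood analyzed in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic(x):
    # Link function: P(sign = +1 | underlying value x). Chosen here for
    # illustration; 1-bit models are also commonly stated with a probit link.
    return 1.0 / (1.0 + np.exp(-x))

# Ground-truth rank-r matrix M* of size d x d.
d, r = 50, 3
M_star = rng.standard_normal((d, r)) @ rng.standard_normal((r, d)) / np.sqrt(r)

# Sample n entry locations uniformly at random (with replacement, for simplicity).
n = 5 * r * d * int(np.log(d))
rows = rng.integers(0, d, size=n)
cols = rng.integers(0, d, size=n)

# Quantization: each observed sign is +1 with probability f(M*_{ij}).
y_clean = np.where(rng.random(n) < logistic(M_star[rows, cols]), 1.0, -1.0)

# Corruption: each observed sign is flipped independently with probability tau < 1/2.
tau = 0.1
y = np.where(rng.random(n) < tau, -y_clean, y_clean)

def neg_log_likelihood(M, rows, cols, y):
    """Negative log-likelihood of the observed signs under the logistic link
    (the flip-agnostic objective); in practice this would be minimized over a
    low-rank or norm-constrained set to estimate M*."""
    z = y * M[rows, cols]
    return np.sum(np.logaddexp(0.0, -z))  # sum of log(1 + exp(-z)), computed stably

print("NLL at ground truth:", neg_log_likelihood(M_star, rows, cols, y))
print("NLL at zero matrix: ", neg_log_likelihood(np.zeros((d, d)), rows, cols, y))
```

Minimizing such an objective subject to a low-rank or norm constraint (e.g., by projected gradient descent) yields an estimate of the underlying matrix; the bounds quoted above quantify how many quantized, possibly flipped observations suffice for that estimate to be accurate.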

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-shen19a,
  title     = {Robust Matrix Completion from Quantized Observations},
  author    = {Shen, Jie and Awasthi, Pranjal and Li, Ping},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {397--407},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/shen19a/shen19a.pdf},
  url       = {https://proceedings.mlr.press/v89/shen19a.html},
  abstract  = {1-bit matrix completion refers to the problem of recovering a real-valued low-rank matrix from a small fraction of its sign patterns. In many real-world applications, however, the observations are not only highly quantized, but also grossly corrupted. In this work, we consider the noisy statistical model where each observed entry can be flipped with some probability after quantization. We propose a simple maximum likelihood estimator which is shown to be robust to the sign flipping noise. In particular, we prove an upper bound on the statistical error, showing that with overwhelming probability $n = O(poly(1-2E[\tau])^{-2} rd \log d)$ samples are sufficient for accurate recovery, where $r$ and $d$ are the rank and dimension of the underlying matrix respectively, and tau in $[0, 1/2)$ is a random variable that parameterizes the sign flipping noise. Furthermore, a lower bound is established showing that the obtained sample complexity is near-optimal for prevalent statistical models. Finally, we substantiate our theoretical findings with a comprehensive study on synthetic and realistic data sets, and demonstrate the state-of-the-art performance.}
}
Endnote
%0 Conference Paper
%T Robust Matrix Completion from Quantized Observations
%A Jie Shen
%A Pranjal Awasthi
%A Ping Li
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-shen19a
%I PMLR
%P 397--407
%U https://proceedings.mlr.press/v89/shen19a.html
%V 89
%X 1-bit matrix completion refers to the problem of recovering a real-valued low-rank matrix from a small fraction of its sign patterns. In many real-world applications, however, the observations are not only highly quantized, but also grossly corrupted. In this work, we consider the noisy statistical model where each observed entry can be flipped with some probability after quantization. We propose a simple maximum likelihood estimator which is shown to be robust to the sign flipping noise. In particular, we prove an upper bound on the statistical error, showing that with overwhelming probability $n = O(poly(1-2E[\tau])^{-2} rd \log d)$ samples are sufficient for accurate recovery, where $r$ and $d$ are the rank and dimension of the underlying matrix respectively, and tau in $[0, 1/2)$ is a random variable that parameterizes the sign flipping noise. Furthermore, a lower bound is established showing that the obtained sample complexity is near-optimal for prevalent statistical models. Finally, we substantiate our theoretical findings with a comprehensive study on synthetic and realistic data sets, and demonstrate the state-of-the-art performance.
APA
Shen, J., Awasthi, P. & Li, P. (2019). Robust Matrix Completion from Quantized Observations. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:397-407. Available from https://proceedings.mlr.press/v89/shen19a.html.