Robust Matrix Completion from Quantized Observations
Proceedings of Machine Learning Research, PMLR 89:397-407, 2019.
Abstract
1-bit matrix completion refers to the problem of recovering a real-valued low-rank matrix from a small fraction of its sign patterns. In many real-world applications, however, the observations are not only highly quantized, but also grossly corrupted. In this work, we consider the noisy statistical model where each observed entry can be flipped with some probability after quantization. We propose a simple maximum likelihood estimator which is shown to be robust to the sign flipping noise. In particular, we prove an upper bound on the statistical error, showing that with overwhelming probability $n = O\big(\mathrm{poly}(1 - 2\mathbb{E}[\tau])^{-2}\, r d \log d\big)$ samples are sufficient for accurate recovery, where $r$ and $d$ are the rank and dimension of the underlying matrix respectively, and $\tau \in [0, 1/2)$ is a random variable that parameterizes the sign flipping noise. Furthermore, a lower bound is established showing that the obtained sample complexity is near-optimal for prevalent statistical models. Finally, we substantiate our theoretical findings with a comprehensive study on synthetic and realistic data sets, and demonstrate the state-of-the-art performance.
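The observation model described above can be simulated in a few lines. The sketch below is illustrative, not the authors' code: it assumes a logistic link for the quantization step and i.i.d. sign flips with a fixed probability $\tau$, and samples on the order of $rd \log d$ entries uniformly at random; all names and constants are hypothetical.

```python
import numpy as np

# Hypothetical simulation of the quantized-and-corrupted observation model:
# a rank-r ground-truth matrix, 1-bit quantization through a logistic link,
# and independent sign flips with probability tau. All names are illustrative.
rng = np.random.default_rng(0)

d, r, tau = 50, 3, 0.1                      # dimension, rank, flip probability
U = rng.normal(size=(d, r)) / np.sqrt(r)
V = rng.normal(size=(d, r)) / np.sqrt(r)
M = U @ V.T                                 # low-rank ground-truth matrix

sigma = lambda x: 1.0 / (1.0 + np.exp(-x))  # assumed logistic link function

# 1-bit quantization: entry (i, j) is observed as +1 with probability sigma(M_ij)
Y = np.where(rng.random((d, d)) < sigma(M), 1.0, -1.0)

# Gross corruption: each observed sign is flipped independently with probability tau
flips = rng.random((d, d)) < tau
Y[flips] = -Y[flips]

# Observe only n = O(r d log d) entries, sampled uniformly without replacement
n = 4 * r * d * int(np.log(d))
idx = rng.choice(d * d, size=n, replace=False)
mask = np.zeros(d * d, dtype=bool)
mask[idx] = True
mask = mask.reshape(d, d)                   # True where an entry is observed
```

A robust maximum likelihood estimator of the kind proposed in the paper would then maximize the (flip-aware) log-likelihood of the observed signs `Y[mask]` over low-rank candidates.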