Robust Matrix Completion from Quantized Observations
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:397-407, 2019.
Abstract
1-bit matrix completion refers to the problem of recovering a real-valued low-rank matrix from a small fraction of its sign patterns. In many real-world applications, however, the observations are not only highly quantized but also grossly corrupted. In this work, we consider a noisy statistical model in which the sign of each observed entry can be flipped with some probability after quantization. We propose a simple maximum likelihood estimator that is shown to be robust to the sign-flipping noise. In particular, we prove an upper bound on the statistical error, showing that with overwhelming probability $n = O\!\left((1 - 2\,\mathbb{E}[\tau])^{-2}\, r d \log d\right)$ samples are sufficient for accurate recovery, where $r$ and $d$ are the rank and dimension of the underlying matrix, respectively, and $\tau \in [0, 1/2)$ is a random variable that parameterizes the sign-flipping noise. Furthermore, a lower bound is established showing that the obtained sample complexity is near-optimal for prevalent statistical models. Finally, we substantiate our theoretical findings with a comprehensive study on synthetic and real-world data sets, and demonstrate state-of-the-art performance.
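To make the observation model and the estimator concrete, the following minimal NumPy sketch (not the authors' code) simulates noisy 1-bit observations under a logistic link and runs a projected gradient scheme on the resulting log-likelihood. The dimensions, flip probability, link function, step size, and iteration count are all illustrative assumptions rather than choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes and flip probability (assumed, not from the paper).
d, r, tau = 100, 3, 0.1
M = rng.standard_normal((d, r)) @ rng.standard_normal((r, d)) / np.sqrt(r)

# Logistic link: P(sign = +1 before flipping) = f(M_ij); any smooth link would do.
def f(x):
    return 1.0 / (1.0 + np.exp(-x))

# Sample n entries uniformly at random (with replacement) and quantize to +/-1.
n = 30 * r * d                      # on the order of r*d*log(d) observations
rows = rng.integers(0, d, n)
cols = rng.integers(0, d, n)
Y = np.where(rng.random(n) < f(M[rows, cols]), 1.0, -1.0)

# Sign-flipping noise: each observed sign is flipped with probability tau.
Y[rng.random(n) < tau] *= -1.0

def lowrank_projection(X, rank):
    """Project onto matrices of rank <= rank via a truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

# Projected gradient descent on the negative log-likelihood of the noisy
# 1-bit observations (one plausible way to compute an MLE-type estimate).
X = np.zeros((d, d))
step = 0.2
for _ in range(300):
    p = f(X[rows, cols])
    # Probability of the observed (possibly flipped) sign under the model.
    q = np.where(Y > 0, tau + (1 - 2 * tau) * p, 1 - tau - (1 - 2 * tau) * p)
    # Gradient of -log q w.r.t. the corresponding entry of X, accumulated
    # over all observations of that entry.
    dq_dx = np.where(Y > 0, 1.0, -1.0) * (1 - 2 * tau) * p * (1 - p)
    grad = np.zeros((d, d))
    np.add.at(grad, (rows, cols), -dq_dx / (q + 1e-12))
    X = lowrank_projection(X - step * grad, r)

print("relative recovery error:", np.linalg.norm(X - M) / np.linalg.norm(M))
```

Note how the factor $(1 - 2\tau)$ appears directly in the likelihood of each observation: as $\tau$ approaches $1/2$ the observations carry less information and the problem becomes harder, which is consistent with the $(1 - 2\,\mathbb{E}[\tau])^{-2}$ dependence in the sample complexity stated above.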