Robust One-Bit Recovery via ReLU Generative Networks: Near-Optimal Statistical Rate and Global Landscape Analysis
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:7857-7866, 2020.
Abstract
We study the robust one-bit compressed sensing problem, whose goal is to design an algorithm that faithfully recovers any sparse target vector $\theta_0 \in \mathbb{R}^d$ \emph{uniformly} from $m$ quantized noisy measurements. Specifically, we consider a new framework for this problem in which the sparsity is implicitly enforced by mapping a low-dimensional representation $x_0 \in \mathbb{R}^k$ through a known $n$-layer ReLU generative network $G: \mathbb{R}^k \to \mathbb{R}^d$ such that $\theta_0 = G(x_0)$. Such a framework imposes a low-dimensional prior on $\theta_0$ without a known sparsity basis. We propose to recover the target $G(x_0)$ by solving an unconstrained empirical risk minimization (ERM) problem. Under a weak \emph{sub-exponential measurement assumption}, we establish a joint statistical and computational analysis. In particular, we prove that the ERM estimator in this new framework achieves a statistical rate of $m = \tilde{O}(kn \log d / \varepsilon^2)$ for recovering any $G(x_0)$ uniformly up to an error $\varepsilon$. When the network is shallow (i.e., $n$ is small), we show that this rate matches the information-theoretic lower bound up to logarithmic factors of $\varepsilon^{-1}$. From the lens of computation, we prove that under proper conditions on the network weights, our proposed empirical risk, despite non-convexity, has no stationary point outside of small neighborhoods around the true representation $x_0$ and its negative multiple; furthermore, under additional assumptions on the weights, we show that the global minimizer of the empirical risk stays within the neighborhood around $x_0$ rather than its negative multiple.
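As a rough illustration of the setup described above (not the authors' implementation), the sketch below pairs a known ReLU generator with one-bit measurements $y_i = \mathrm{sign}(\langle a_i, G(x_0)\rangle + \xi_i)$ and runs gradient descent on a Plan–Vershynin-style surrogate risk. The network widths, step size, and the specific surrogate loss are illustrative assumptions; since sign measurements alone do not identify scale under this surrogate, recovery is assessed up to normalization.

```python
# A minimal, hypothetical sketch: a known ReLU generator G, one-bit noisy
# measurements, and gradient descent on an assumed surrogate empirical risk.
# The paper's exact loss and weight conditions are not reproduced here.
import jax
import jax.numpy as jnp

def generator(weights, x):
    """Known n-layer ReLU generative network G: R^k -> R^d."""
    h = x
    for W in weights:
        h = jax.nn.relu(W @ h)
    return h

def empirical_risk(x, weights, A, y):
    """Assumed surrogate: L(x) = ||G(x)||^2 / 2 - (1/m) sum_i y_i <a_i, G(x)>."""
    Gx = generator(weights, x)
    return 0.5 * jnp.dot(Gx, Gx) - jnp.mean(y * (A @ Gx))

key = jax.random.PRNGKey(0)
k, d, m = 5, 100, 2000                  # latent dim k, ambient dim d, measurements m
sizes = [k, 50, d]                      # an n = 2 layer network (widths illustrative)
keys = jax.random.split(key, 5)
weights = [jax.random.normal(keys[i], (sizes[i + 1], sizes[i])) / jnp.sqrt(sizes[i])
           for i in range(len(sizes) - 1)]

x0 = jax.random.normal(keys[2], (k,))   # true low-dimensional representation
theta0 = generator(weights, x0)         # target signal theta_0 = G(x_0)

A = jax.random.normal(keys[3], (m, d))  # Gaussian measurement rows a_i
noise = 0.1 * jax.random.normal(keys[4], (m,))
y = jnp.sign(A @ theta0 + noise)        # one-bit quantized, noisy measurements

# Plain gradient descent on the non-convex empirical risk.
grad_fn = jax.jit(jax.grad(empirical_risk))
x = 0.1 * jnp.ones(k)
for _ in range(500):
    x = x - 0.05 * grad_fn(x, weights, A, y)

# Sign measurements fix theta_0 only up to scale under this surrogate,
# so compare normalized directions.
Gx = generator(weights, x)
err = jnp.linalg.norm(Gx / jnp.linalg.norm(Gx) - theta0 / jnp.linalg.norm(theta0))
print(f"directional recovery error: {err:.3f}")
```

Because the bias-free ReLU network is positively homogeneous, its range is a cone containing every positive multiple of $\theta_0$, so minimizing this surrogate over the latent space can match the direction of $\theta_0$ even though its magnitude is lost in quantization.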