Lifting high-dimensional nonlinear models with Gaussian regressors
Proceedings of Machine Learning Research, PMLR 89:3206-3215, 2019.
Abstract
We study the problem of recovering a structured signal $\mathbf{x}_0$ from high-dimensional data $\mathbf{y}_i=f(\mathbf{a}_i^T\mathbf{x}_0)$ for some nonlinear (and potentially unknown) link function $f$, when the regressors $\mathbf{a}_i$ are i.i.d. Gaussian. Brillinger (1982) showed that ordinary least squares estimates $\mathbf{x}_0$ up to a constant of proportionality $\mu_\ell$, which depends on $f$. Recently, Plan & Vershynin (2015) extended this result to the high-dimensional setting, deriving sharp error bounds for the generalized Lasso. Unfortunately, both least squares and the Lasso fail to recover $\mathbf{x}_0$ when $\mu_\ell=0$; for example, this includes all even link functions. We resolve this issue by proposing and analyzing an alternative convex recovery method. In a nutshell, our method treats such link functions as if they were linear in a lifted space of higher dimension. Interestingly, our error analysis captures the effect of both the nonlinearity and the problem's geometry in a few simple summary parameters.
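To illustrate why lifting helps, the following sketch (not the paper's convex program, but a simple spectral stand-in for the lifting idea, with all variable names and dimensions chosen for illustration) uses the even link $f(t)=t^2$, for which $\mu_\ell=\mathbb{E}[f(g)g]=0$. Ordinary least squares then returns a direction uncorrelated with $\mathbf{x}_0$, while in the lifted space each $y_i$ is linear in the rank-one matrix $\mathbf{a}_i\mathbf{a}_i^T$, and the top eigenvector of the empirical second-moment matrix recovers $\mathbf{x}_0$ up to sign:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20000, 10
x0 = rng.standard_normal(d)
x0 /= np.linalg.norm(x0)                 # unit-norm planted signal

A = rng.standard_normal((n, d))          # i.i.d. Gaussian regressors a_i
y = (A @ x0) ** 2                        # even link f(t) = t^2, so mu_ell = 0

# Ordinary least squares: estimates mu_ell * x0, which is 0 here,
# so the fitted direction is essentially noise.
ols, *_ = np.linalg.lstsq(A, y, rcond=None)
corr_ols = abs(ols @ x0) / np.linalg.norm(ols)

# Lifted view: y_i is *linear* in a_i a_i^T.  For Gaussian a,
# E[y a a^T] = ||x0||^2 I + 2 x0 x0^T, so the leading eigenvector of
# the empirical average M aligns with x0 (up to sign).
M = (A.T * y) @ A / n                    # M = (1/n) sum_i y_i a_i a_i^T
eigvals, eigvecs = np.linalg.eigh(M)
v = eigvecs[:, -1]                       # eigenvector of the largest eigenvalue
corr_lift = abs(v @ x0)

print(f"OLS correlation with x0:    {corr_ols:.2f}")
print(f"Lifted correlation with x0: {corr_lift:.2f}")
```

With these sample sizes the lifted estimate is strongly aligned with $\mathbf{x}_0$ while the least-squares direction is not, matching the dichotomy described in the abstract.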