Subspace Embedding and Linear Regression with Orlicz Norm
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:224-233, 2018.
Abstract
We consider a generalization of the classic linear regression problem to the case when the loss is an Orlicz norm. An Orlicz norm is parameterized by a nonnegative convex function G: R_+ -> R_+ with G(0) = 0: the Orlicz norm of an n-dimensional vector x is defined as ||x||_G = inf{ alpha > 0 : sum_{i=1}^n G(|x_i| / alpha) <= 1 }. We consider the cases where the function G grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given n x d matrix A with Orlicz norm into a lower-dimensional space with L2 norm. Specifically, we show how to efficiently find an m x n embedding matrix S (m < n), such that for every d-dimensional vector x, we have Omega(1/(d log n)) ||Ax||_G <= ||SAx||_2 <= O(d^2 log n) ||Ax||_G. By applying this subspace embedding technique, we obtain an approximation algorithm for the regression problem min_x ||Ax - b||_G, up to an O(d log^2 n) factor. As a further application of our techniques, we show how to also use them to improve on the algorithm for the L_p low rank matrix approximation problem for 1 <= p < 2.
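To make the definition concrete, here is a minimal Python sketch (not from the paper) that evaluates the Orlicz norm ||x||_G = inf{ alpha > 0 : sum_i G(|x_i|/alpha) <= 1 } by bisection on alpha. It assumes G is nonnegative, convex, increasing, and G(0) = 0, so that sum_i G(|x_i|/alpha) is nonincreasing in alpha; the function names are illustrative only.

```python
def orlicz_norm(x, G, tol=1e-9):
    """Approximate ||x||_G = inf{ alpha > 0 : sum_i G(|x_i|/alpha) <= 1 }
    by bisection. Assumes G: R_+ -> R_+ is convex, increasing, G(0) = 0,
    so the constraint sum is monotone nonincreasing in alpha."""
    if all(v == 0 for v in x):
        return 0.0

    def total(alpha):
        return sum(G(abs(v) / alpha) for v in x)

    # Bracket the infimum: grow hi until the constraint holds,
    # shrink lo until it fails.
    lo = hi = max(abs(v) for v in x)
    while total(hi) > 1:
        hi *= 2
    while total(lo) <= 1:
        lo /= 2
    # Bisect to relative tolerance.
    while hi - lo > tol * hi:
        mid = (lo + hi) / 2
        if total(mid) <= 1:
            hi = mid
        else:
            lo = mid
    return hi

# Sanity check: with G(t) = t^2 the Orlicz norm coincides with the L2 norm.
print(orlicz_norm([3.0, 4.0], lambda t: t * t))  # close to 5.0
```

With G(t) = t^p for 1 <= p < 2 this recovers the L_p norm, which is the subquadratic regime the paper considers.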