Space lower bounds for linear prediction in the streaming model
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:929–954, 2019.
Abstract
We show that fundamental learning tasks, such as finding an approximate linear separator or linear regression, require memory at least quadratic in the dimension, in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem – finding approximate null vectors – and utilize estimates on the packing of the Grassmannian, the manifold of all linear subspaces of fixed dimension.
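To make the underlying linear-algebraic problem concrete, the following is a minimal sketch (not taken from the paper) of the approximate-null-vector task in a streaming setting: rows of a matrix arrive one at a time, and the goal is a unit vector nearly orthogonal to all of them. The natural algorithm accumulates the d × d Gram matrix, using Θ(d²) memory; the paper's lower bound shows that in this setting quadratic memory is essentially unavoidable.

```python
import numpy as np

def approx_null_vector(row_stream, d):
    """Hypothetical one-pass streaming algorithm: keep the d x d
    Gram matrix A^T A (Theta(d^2) words of memory) and return the
    eigenvector of its smallest eigenvalue, which minimizes ||Av||."""
    gram = np.zeros((d, d))
    for a in row_stream:
        gram += np.outer(a, a)   # accumulate A^T A one row at a time
    _, eigvecs = np.linalg.eigh(gram)
    return eigvecs[:, 0]         # unit vector for the smallest eigenvalue

# Toy instance: rows drawn from a rank-(d-1) subspace, so an exact
# null vector exists and the returned residual should be tiny.
rng = np.random.default_rng(0)
d = 5
basis = rng.standard_normal((d - 1, d))
rows = [rng.standard_normal(d - 1) @ basis for _ in range(100)]

v = approx_null_vector(iter(rows), d)
residual = max(abs(np.dot(a, v)) for a in rows)
```

The sketch only illustrates the memory cost of the obvious approach; the paper's contribution is showing that no streaming algorithm can do substantially better in this setting.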