Space lower bounds for linear prediction in the streaming model

Yuval Dagan, Gil Kur, Ohad Shamir
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:929-954, 2019.

Abstract

We show that fundamental learning tasks, such as finding an approximate linear separator or linear regression, require memory at least \emph{quadratic} in the dimension, in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem, finding approximate null vectors, and utilize estimates on the packing of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.
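To make the setting concrete, here is a minimal illustrative sketch (ours, not from the paper) of the natural streaming baseline for the approximate-null-vector problem: maintain the d x d Gram matrix of the rows seen so far and output an eigenvector of its smallest eigenvalue. This baseline stores Theta(d^2) numbers, which is precisely the memory regime the paper's quadratic lower bound addresses. The function name and test data below are hypothetical.

    import numpy as np

    def approx_null_vector_stream(rows, d):
        # Illustrative baseline, not the paper's construction: keep the
        # d x d Gram matrix G = sum_i x_i x_i^T of the rows seen so far
        # (Theta(d^2) memory), then return the eigenvector of the smallest
        # eigenvalue, which minimizes v^T G v = ||A v||^2 over unit vectors v.
        G = np.zeros((d, d))
        for x in rows:                         # each row is read once, then discarded
            G += np.outer(x, x)
        eigvals, eigvecs = np.linalg.eigh(G)   # eigenvalues in ascending order
        return eigvecs[:, 0]

    # Usage: 100 streamed rows drawn from a 3-dimensional subspace of R^5,
    # so an exact null vector exists and the output should nearly annihilate B.
    rng = np.random.default_rng(0)
    d = 5
    B = rng.standard_normal((3, d))
    stream = (B.T @ rng.standard_normal(3) for _ in range(100))
    v = approx_null_vector_stream(stream, d)
    print(np.linalg.norm(B @ v))               # close to 0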

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-dagan19b,
  title     = {Space lower bounds for linear prediction in the streaming model},
  author    = {Dagan, Yuval and Kur, Gil and Shamir, Ohad},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {929--954},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/dagan19b/dagan19b.pdf},
  url       = {https://proceedings.mlr.press/v99/dagan19b.html},
  abstract  = {We show that fundamental learning tasks, such as finding an approximate linear separator or linear regression, require memory at least \emph{quadratic} in the dimension, in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem, finding approximate null vectors, and utilize estimates on the packing of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.}
}
Endnote
%0 Conference Paper
%T Space lower bounds for linear prediction in the streaming model
%A Yuval Dagan
%A Gil Kur
%A Ohad Shamir
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-dagan19b
%I PMLR
%P 929--954
%U https://proceedings.mlr.press/v99/dagan19b.html
%V 99
%X We show that fundamental learning tasks, such as finding an approximate linear separator or linear regression, require memory at least \emph{quadratic} in the dimension, in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem, finding approximate null vectors, and utilize estimates on the packing of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.
APA
Dagan, Y., Kur, G. & Shamir, O. (2019). Space lower bounds for linear prediction in the streaming model. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:929-954. Available from https://proceedings.mlr.press/v99/dagan19b.html.