Mixing Implies Lower Bounds for Space Bounded Learning

Dana Moshkovitz, Michal Moshkovitz
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:1516-1566, 2017.

Abstract

One can learn any hypothesis class H with O(log |H|) labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires |X|^(O(log |H|)) memory states, where X is the set of all labeled examples. This motivates the question of how many labeled examples are needed when the memory is bounded. Previous work showed, using techniques such as linear algebra and Fourier analysis, that parities cannot be learned with bounded memory and fewer than |H|^(Omega(1)) examples. One might wonder whether a general combinatorial condition exists for unlearnability with bounded memory, as we have with the condition VCdim(H) = Infinity for PAC unlearnability. In this paper we give such a condition. We show that if a hypothesis class H, when viewed as a bipartite graph between hypotheses H and labeled examples X, is mixing, then learning it requires |H|^(Omega(1)) examples under a certain bound on the memory. Note that the class of parities is mixing. Moreover, as an immediate corollary, we get that most hypothesis classes are unlearnable with bounded memory. Our proof technique is combinatorial in nature and very different from previous analyses.
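The bipartite view described in the abstract can be made concrete on a toy instance. The sketch below (not from the paper; the choice n = 3 and the use of NumPy are assumptions for illustration) builds the class of parities over {0,1}^n as a bipartite graph between hypotheses h_a(x) = ⟨a, x⟩ mod 2 and labeled examples (x, b), then checks two hallmarks of mixing: every hypothesis is consistent with exactly half of the labeled examples, and the biadjacency matrix has a large spectral gap.

```python
from itertools import product
import numpy as np

n = 3  # toy dimension, chosen only for illustration
hypotheses = list(product([0, 1], repeat=n))  # parity h_a(x) = <a, x> mod 2
examples = [(x, b) for x in product([0, 1], repeat=n) for b in (0, 1)]

# Biadjacency matrix of the bipartite graph: M[i, j] = 1 iff
# hypothesis i labels example j correctly (i.e., h_a(x) = b).
M = np.array([[1 if sum(ai * xi for ai, xi in zip(h, x)) % 2 == b else 0
               for (x, b) in examples] for h in hypotheses])

# Regularity: each parity agrees with exactly one of (x, 0), (x, 1)
# for every x, hence with exactly half of all labeled examples.
assert all(M.sum(axis=1) == len(examples) // 2)

# Spectral gap: a small ratio s[1]/s[0] between the second and first
# singular values means edge counts between any sets A of hypotheses
# and B of examples are close to |A||B|/2 -- the mixing property.
s = np.linalg.svd(M, compute_uv=False)
print(s[0], s[1])  # for n = 3: 6.0 and 2.0
```

For general n the ratio s[1]/s[0] is 1/sqrt(2^n + 1), so the graph mixes increasingly well as the class grows; this is the sense in which the class of parities is mixing.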

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-moshkovitz17a,
  title     = {Mixing Implies Lower Bounds for Space Bounded Learning},
  author    = {Moshkovitz, Dana and Moshkovitz, Michal},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {1516--1566},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/moshkovitz17a/moshkovitz17a.pdf},
  url       = {https://proceedings.mlr.press/v65/moshkovitz17a.html},
  abstract  = {One can learn any hypothesis class H with O(log |H|) labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires |X|^(O(log |H|)) memory states, where X is the set of all labeled examples. This motivates the question of how many labeled examples are needed when the memory is bounded. Previous work showed, using techniques such as linear algebra and Fourier analysis, that parities cannot be learned with bounded memory and fewer than |H|^(Omega(1)) examples. One might wonder whether a general combinatorial condition exists for unlearnability with bounded memory, as we have with the condition VCdim(H) = Infinity for PAC unlearnability. In this paper we give such a condition. We show that if a hypothesis class H, when viewed as a bipartite graph between hypotheses H and labeled examples X, is mixing, then learning it requires |H|^(Omega(1)) examples under a certain bound on the memory. Note that the class of parities is mixing. Moreover, as an immediate corollary, we get that most hypothesis classes are unlearnable with bounded memory. Our proof technique is combinatorial in nature and very different from previous analyses.}
}
Endnote
%0 Conference Paper
%T Mixing Implies Lower Bounds for Space Bounded Learning
%A Dana Moshkovitz
%A Michal Moshkovitz
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-moshkovitz17a
%I PMLR
%P 1516--1566
%U https://proceedings.mlr.press/v65/moshkovitz17a.html
%V 65
%X One can learn any hypothesis class H with O(log |H|) labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires |X|^(O(log |H|)) memory states, where X is the set of all labeled examples. This motivates the question of how many labeled examples are needed when the memory is bounded. Previous work showed, using techniques such as linear algebra and Fourier analysis, that parities cannot be learned with bounded memory and fewer than |H|^(Omega(1)) examples. One might wonder whether a general combinatorial condition exists for unlearnability with bounded memory, as we have with the condition VCdim(H) = Infinity for PAC unlearnability. In this paper we give such a condition. We show that if a hypothesis class H, when viewed as a bipartite graph between hypotheses H and labeled examples X, is mixing, then learning it requires |H|^(Omega(1)) examples under a certain bound on the memory. Note that the class of parities is mixing. Moreover, as an immediate corollary, we get that most hypothesis classes are unlearnable with bounded memory. Our proof technique is combinatorial in nature and very different from previous analyses.
APA
Moshkovitz, D. & Moshkovitz, M. (2017). Mixing Implies Lower Bounds for Space Bounded Learning. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:1516-1566. Available from https://proceedings.mlr.press/v65/moshkovitz17a.html.