Memory, Communication, and Statistical Queries

Jacob Steinhardt, Gregory Valiant, Stefan Wager
29th Annual Conference on Learning Theory, PMLR 49:1490-1516, 2016.

Abstract

If a concept class can be represented with a certain amount of memory, can it be efficiently learned with the same amount of memory? What concepts can be efficiently learned by algorithms that extract only a few bits of information from each example? We introduce a formal framework for studying these questions, and investigate the relationship between the fundamental resources of memory or communication and the sample complexity of the learning task. We relate our memory-bounded and communication-bounded learning models to the well-studied statistical query model. This connection can be leveraged to obtain both upper and lower bounds: we show strong lower bounds on learning parity functions with bounded communication, as well as upper bounds on solving sparse linear regression problems with limited memory.
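To make the statistical query (SQ) connection concrete, below is a minimal illustrative sketch (not code from the paper) of the standard SQ oracle of Kearns: the learner asks for E[phi(x, y)] and receives an answer accurate to within an additive tolerance tau. Note that each example contributes only the single bit phi(x_i, y_i), which is exactly the few-bits-per-example regime the abstract describes. The demo then queries single-coordinate agreement against a parity target; the secret index set S, sample size, and tolerance are hypothetical choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sq_oracle(phi, examples, tau):
    # Each example contributes only the single bit phi(x_i, y_i); the
    # oracle returns the empirical mean plus adversarial noise up to tau,
    # modeling the usual SQ guarantee of an answer within E[phi] +/- tau.
    bits = np.array([phi(x, y) for (x, y) in examples], dtype=float)
    return float(bits.mean()) + rng.uniform(-tau, tau)

# Target concept: the parity chi_S(x) = XOR of the coordinates of x in S.
n = 10
S = [1, 4, 7]  # secret index set (hypothetical choice for this demo)

def target(x):
    return int(np.bitwise_xor.reduce(x[S]))

# Uniform examples over {0,1}^n labeled by the parity.
m = 50_000
X = rng.integers(0, 2, size=(m, n))
examples = [(x, target(x)) for x in X]

# Query the agreement of the label with each single coordinate. A parity
# over two or more coordinates is uncorrelated with every individual bit,
# so all answers hover around 1/2 -- the classical obstruction that makes
# parities hard for SQ (and, per this paper, communication-bounded) learners.
for i in range(n):
    phi = lambda x, y, i=i: int(y == x[i])
    print(f"E[1{{y = x_{i}}}] ~= {sq_oracle(phi, examples, tau=0.01):.3f}")
```

In this sketch every query answer is uninformative at tolerance tau = 0.01, illustrating why distinguishing among the 2^n parities requires exponentially many such low-information measurements.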

Cite this Paper


BibTeX

@InProceedings{pmlr-v49-steinhardt16,
  title     = {Memory, Communication, and Statistical Queries},
  author    = {Steinhardt, Jacob and Valiant, Gregory and Wager, Stefan},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {1490--1516},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/steinhardt16.pdf},
  url       = {https://proceedings.mlr.press/v49/steinhardt16.html}
}
Endnote

%0 Conference Paper
%T Memory, Communication, and Statistical Queries
%A Jacob Steinhardt
%A Gregory Valiant
%A Stefan Wager
%B 29th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2016
%E Vitaly Feldman
%E Alexander Rakhlin
%E Ohad Shamir
%F pmlr-v49-steinhardt16
%I PMLR
%P 1490--1516
%U https://proceedings.mlr.press/v49/steinhardt16.html
%V 49
RIS

TY - CPAPER
TI - Memory, Communication, and Statistical Queries
AU - Jacob Steinhardt
AU - Gregory Valiant
AU - Stefan Wager
BT - 29th Annual Conference on Learning Theory
DA - 2016/06/06
ED - Vitaly Feldman
ED - Alexander Rakhlin
ED - Ohad Shamir
ID - pmlr-v49-steinhardt16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 49
SP - 1490
EP - 1516
L1 - http://proceedings.mlr.press/v49/steinhardt16.pdf
UR - https://proceedings.mlr.press/v49/steinhardt16.html
ER -
APA

Steinhardt, J., Valiant, G. & Wager, S. (2016). Memory, Communication, and Statistical Queries. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:1490-1516. Available from https://proceedings.mlr.press/v49/steinhardt16.html.