Distribution Free Learning with Local Queries

Galit Bary-Weisberg, Amit Daniely, Shai Shalev-Shwartz
Proceedings of the 31st International Conference on Algorithmic Learning Theory, PMLR 117:133-147, 2020.

Abstract

The model of learning with local membership queries interpolates between the PAC model and the membership queries model by allowing the learner to query the label of any example that is similar to an example in the training set. This model, recently proposed and studied by Awasthi et al (2012), aims to facilitate the practical use of membership queries. We continue this line of work, proving both positive and negative results in the distribution-free setting. We restrict to the Boolean cube $\{-1, 1\}^n$, and say that a query is $q$-local if it is of Hamming distance $\le q$ from some training example. On the positive side, we show that $1$-local queries already give additional strength, allowing the learner to learn a certain type of DNF formulas that are not learnable without queries, assuming that learning decision trees is hard. On the negative side, we show that even $\left(n^{0.99}\right)$-local queries cannot help to learn various classes including Automata, DNFs and more. Likewise, $q$-local queries for any constant $q$ cannot help to learn Juntas, Decision Trees, Sparse Polynomials and more. Moreover, for these classes, an algorithm that uses $\left(\log^{0.99}(n)\right)$-local queries would lead to a breakthrough in the best known running times.
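The $q$-locality condition from the abstract can be sketched directly: a query is $q$-local when it lies within Hamming distance $q$ of at least one training example on the Boolean cube $\{-1, 1\}^n$. A minimal illustration (the helper names here are illustrative, not from the paper):

```python
def hamming(x, y):
    """Hamming distance between two points of the Boolean cube {-1, 1}^n."""
    return sum(a != b for a, b in zip(x, y))

def is_q_local(query, training_set, q):
    """A query is q-local if it is within Hamming distance q
    of some example in the training set."""
    return any(hamming(query, x) <= q for x in training_set)

# Example on {-1, 1}^4: flipping one coordinate of a training point
# yields a 1-local query; a point far from all examples is not 1-local.
S = [(1, 1, -1, 1), (-1, -1, -1, 1)]
print(is_q_local((1, -1, -1, 1), S, 1))   # True: distance 1 from the first example
print(is_q_local((-1, 1, 1, -1), S, 1))   # False: distance >= 3 from both examples
```

In this model, a plain PAC learner corresponds to $q = 0$ (it may only query points it already holds), while unrestricted membership queries correspond to $q = n$.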

Cite this Paper


BibTeX
@InProceedings{pmlr-v117-bary-weisberg20a,
  title     = {Distribution Free Learning with Local Queries},
  author    = {Bary-Weisberg, Galit and Daniely, Amit and Shalev-Shwartz, Shai},
  booktitle = {Proceedings of the 31st International Conference on Algorithmic Learning Theory},
  pages     = {133--147},
  year      = {2020},
  editor    = {Kontorovich, Aryeh and Neu, Gergely},
  volume    = {117},
  series    = {Proceedings of Machine Learning Research},
  month     = {08 Feb--11 Feb},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v117/bary-weisberg20a/bary-weisberg20a.pdf},
  url       = {https://proceedings.mlr.press/v117/bary-weisberg20a.html},
  abstract  = {The model of learning with local membership queries interpolates between the PAC model and the membership queries model by allowing the learner to query the label of any example that is similar to an example in the training set. This model, recently proposed and studied by Awasthi et al (2012), aims to facilitate the practical use of membership queries. We continue this line of work, proving both positive and negative results in the distribution-free setting. We restrict to the Boolean cube $\{-1, 1\}^n$, and say that a query is $q$-local if it is of Hamming distance $\le q$ from some training example. On the positive side, we show that $1$-local queries already give additional strength, allowing the learner to learn a certain type of DNF formulas that are not learnable without queries, assuming that learning decision trees is hard. On the negative side, we show that even $\left(n^{0.99}\right)$-local queries cannot help to learn various classes including Automata, DNFs and more. Likewise, $q$-local queries for any constant $q$ cannot help to learn Juntas, Decision Trees, Sparse Polynomials and more. Moreover, for these classes, an algorithm that uses $\left(\log^{0.99}(n)\right)$-local queries would lead to a breakthrough in the best known running times.}
}
Endnote
%0 Conference Paper
%T Distribution Free Learning with Local Queries
%A Galit Bary-Weisberg
%A Amit Daniely
%A Shai Shalev-Shwartz
%B Proceedings of the 31st International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Aryeh Kontorovich
%E Gergely Neu
%F pmlr-v117-bary-weisberg20a
%I PMLR
%P 133--147
%U https://proceedings.mlr.press/v117/bary-weisberg20a.html
%V 117
%X The model of learning with local membership queries interpolates between the PAC model and the membership queries model by allowing the learner to query the label of any example that is similar to an example in the training set. This model, recently proposed and studied by Awasthi et al (2012), aims to facilitate the practical use of membership queries. We continue this line of work, proving both positive and negative results in the distribution-free setting. We restrict to the Boolean cube $\{-1, 1\}^n$, and say that a query is $q$-local if it is of Hamming distance $\le q$ from some training example. On the positive side, we show that $1$-local queries already give additional strength, allowing the learner to learn a certain type of DNF formulas that are not learnable without queries, assuming that learning decision trees is hard. On the negative side, we show that even $\left(n^{0.99}\right)$-local queries cannot help to learn various classes including Automata, DNFs and more. Likewise, $q$-local queries for any constant $q$ cannot help to learn Juntas, Decision Trees, Sparse Polynomials and more. Moreover, for these classes, an algorithm that uses $\left(\log^{0.99}(n)\right)$-local queries would lead to a breakthrough in the best known running times.
APA
Bary-Weisberg, G., Daniely, A., & Shalev-Shwartz, S. (2020). Distribution Free Learning with Local Queries. Proceedings of the 31st International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 117:133-147. Available from https://proceedings.mlr.press/v117/bary-weisberg20a.html.