Learning Using Local Membership Queries
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:398-431, 2013.
Abstract
We introduce a new model of membership query (MQ) learning, where the learning algorithm is restricted to query points that are close to random examples drawn from the underlying distribution. The learning model is intermediate between the PAC model (Valiant, 1984) and the PAC+MQ model (where the queries are allowed to be arbitrary points). Membership query algorithms are not popular among machine learning practitioners. Apart from the obvious difficulty of adaptively querying labellers, it has also been observed that querying unnatural points leads to increased noise from human labellers (Lang and Baum, 1992). This motivates our study of learning algorithms that make queries that are close to examples generated from the data distribution. We restrict our attention to functions defined on the n-dimensional Boolean hypercube and say that a membership query is local if its Hamming distance from some example in the (random) training data is at most O(log(n)). We show the following results in this model:

1. The class of sparse polynomials (with coefficients in R) over {0, 1}^n is polynomial-time learnable under a large class of locally smooth distributions using O(log(n))-local queries. This class also includes the class of O(log(n))-depth decision trees.

2. The class of polynomial-size decision trees is polynomial-time learnable under product distributions using O(log(n))-local queries.

3. The class of polynomial-size DNF formulas is learnable under the uniform distribution using O(log(n))-local queries in time n^O(log(log(n))).

4. In addition, we prove a number of results relating the proposed model to the traditional PAC model and the PAC+MQ model.
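To make the locality condition concrete, the following is a minimal Python sketch (not from the paper) of the check it implies: a query point q in {0, 1}^n is local if its Hamming distance from some example in the training sample is at most c * log2(n). The constant c, the function names, and the choice of base-2 logarithm are illustrative assumptions standing in for the O(log(n)) bound.

import math

def hamming(x, y):
    # Hamming distance between two equal-length 0/1 tuples.
    return sum(a != b for a, b in zip(x, y))

def is_local_query(q, sample, c=1.0):
    # True if q lies within c * log2(n) Hamming distance of some
    # training example; c is an assumed constant hidden by O(log n).
    n = len(q)
    radius = c * math.log2(n)
    return any(hamming(q, x) <= radius for x in sample)

# Example with n = 8, so the radius is 3 for c = 1: flipping 2 bits of a
# training point yields a local query, while flipping all 8 bits does not.
sample = [(0, 0, 0, 0, 0, 0, 0, 0)]
assert is_local_query((1, 1, 0, 0, 0, 0, 0, 0), sample)      # distance 2 <= 3
assert not is_local_query((1, 1, 1, 1, 1, 1, 1, 1), sample)  # distance 8 > 3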