Learning and Testing Junta Distributions


Maryam Aliakbarpour, Eric Blais, Ronitt Rubinfeld;
29th Annual Conference on Learning Theory, PMLR 49:19-46, 2016.

Abstract

We consider the problem of learning distributions in the presence of irrelevant features. This problem is formalized by introducing a new notion of k-junta distributions. Informally, a distribution D over the domain X^n is a k-junta distribution with respect to another distribution U over the same domain if there is a set J ⊆ [n] of size |J| ≤ k that captures the difference between D and U. We show that it is possible to learn k-junta distributions with respect to the uniform distribution over the Boolean hypercube {0,1}^n in time poly(n^k, 1/ε). This result is obtained via a new Fourier-based learning algorithm inspired by the Low-Degree Algorithm of Linial, Mansour, and Nisan (1993). We also consider the problem of testing whether an unknown distribution is a k-junta distribution with respect to the uniform distribution. We give a nearly-optimal algorithm for this task. Both the analysis of the algorithm and the lower bound showing its optimality are obtained by establishing connections between the problem of testing junta distributions and testing uniformity of weighted collections of distributions.
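To make the definition concrete, the following is a minimal sketch (not from the paper) of how one might sample from a k-junta distribution with respect to the uniform distribution over {0,1}^n: coordinates outside the relevant set J are independent uniform bits, while the joint distribution of the coordinates in J is arbitrary. The function name `sample_junta` and the representation of the junta's joint distribution as a dict `junta_probs` are illustrative choices, not notation from the paper.

```python
import random

def sample_junta(n, J, junta_probs, rng=random):
    """Draw one sample from a hypothetical k-junta distribution over {0,1}^n.

    J           -- the set of (at most k) relevant coordinate indices.
    junta_probs -- dict mapping each tuple in {0,1}^|J| (ordered by
                   sorted(J)) to its probability; all other coordinates
                   are uniform and independent, matching the definition
                   of a k-junta distribution w.r.t. the uniform one.
    """
    J = sorted(J)
    settings = list(junta_probs)
    weights = [junta_probs[s] for s in settings]
    # Sample the relevant coordinates jointly according to junta_probs.
    relevant = rng.choices(settings, weights=weights)[0]
    # Start from a uniform point, then override the relevant coordinates.
    x = [rng.randrange(2) for _ in range(n)]
    for bit, i in zip(relevant, J):
        x[i] = bit
    return tuple(x)
```

For example, with J = {2} and junta_probs = {(1,): 1.0}, every sample is a uniform point of {0,1}^5 whose third coordinate is forced to 1; with J = ∅ the sampler reduces to the uniform distribution itself.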
