Learning and testing junta distributions with subcube conditioning
Proceedings of the Thirty Fourth Conference on Learning Theory, PMLR 134:1060-1113, 2021.
Abstract
We study the problems of learning and testing junta distributions on {−1,1}ⁿ with respect to the uniform distribution, where a distribution p is a k-junta if its probability mass function p(x) depends on at most k of the n variables. Our main contribution is an algorithm for finding relevant coordinates in a k-junta distribution with subcube conditioning (Bhattacharyya et al. 2018; Canonne et al. 2019). We give two applications: an algorithm for learning k-junta distributions with Õ(k/ϵ²) log n + O(2ᵏ/ϵ²) subcube conditioning queries, and an algorithm for testing k-junta distributions with Õ((k + √n)/ϵ²) subcube conditioning queries. All our algorithms are optimal up to polylogarithmic factors.
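To make the access model concrete, the following is a minimal Python sketch of a subcube conditioning oracle: given a restriction fixing some coordinates of x ∈ {−1,1}ⁿ, it returns a sample from p conditioned on the corresponding subcube. The function names (`subcube_conditioning_sample`, `sample_p`) and the rejection-sampling realization are illustrative assumptions, not the paper's construction; in the model itself, each conditional sample counts as a single query.

```python
import random

def subcube_conditioning_sample(sample_p, restriction, max_tries=100000):
    """Draw one sample from p conditioned on the subcube described by
    `restriction`, a dict {coordinate index: +1 or -1}; coordinates
    absent from the dict are left free.  Implemented here by naive
    rejection sampling on top of an ordinary sampler `sample_p`
    (a hypothetical interface; the subcube conditioning model grants
    this conditional sample directly, without rejection overhead)."""
    for _ in range(max_tries):
        x = sample_p()  # a tuple in {-1, +1}^n
        if all(x[i] == b for i, b in restriction.items()):
            return x
    raise RuntimeError("conditioning event too rare for rejection sampling")

# Example: p is a 1-junta on {-1,1}^3 whose mass depends only on coordinate 0.
def sample_p():
    bit = 1 if random.random() < 0.9 else -1   # coordinate 0 is biased
    return (bit, random.choice([-1, 1]), random.choice([-1, 1]))

print(subcube_conditioning_sample(sample_p, {1: +1}))  # condition on x_1 = +1
```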
Our results show that subcube conditioning, as a natural model for accessing high-dimensional distributions, enables significant savings in learning and testing junta distributions compared to the standard sampling model. This addresses an open question posed by Aliakbarpour et al. (2016).