Bounded Memory Active Learning through Enriched Queries
Proceedings of the Thirty Fourth Conference on Learning Theory, PMLR 134:2358-2387, 2021.
Abstract
The explosive growth of easily-accessible unlabeled data has led to growing interest in \emph{active learning}, a paradigm in which data-hungry learning algorithms adaptively select informative examples in order to lower prohibitively expensive labeling costs. Unfortunately, in standard worst-case models of learning, the active setting often provides no improvement over non-adaptive algorithms. To combat this, a series of recent works have considered a model in which the learner may ask \emph{enriched} queries beyond labels. While such models have seen success in drastically lowering label costs, they tend to come at the expense of requiring large amounts of memory. In this work, we study which families of classifiers can be learned in \emph{bounded memory}. To this end, we introduce a novel streaming variant of enriched-query active learning along with a natural combinatorial parameter called \emph{lossless sample compression} that is sufficient for learning not only with bounded memory, but in a query-optimal and computationally efficient manner as well. Finally, we give three fundamental examples of classifier families with small, easy-to-compute lossless compression schemes when given access to basic enriched queries: axis-aligned rectangles, decision trees, and halfspaces in two dimensions.