Sample Compression for Multi-label Concept Classes

Rahim Samei, Pavel Semukhin, Boting Yang, Sandra Zilles;
Proceedings of The 27th Conference on Learning Theory, PMLR 35:371-393, 2014.

Abstract

This paper studies labeled sample compression for multi-label concept classes. For a specific extension of the notion of VC-dimension to multi-label classes, we prove that every maximum multi-label class of dimension d has a sample compression scheme in which every sample is compressed to a subset of size at most d. We further show that every multi-label class of dimension 1 has a sample compression scheme using only sets of size at most 1. As opposed to the binary case, the latter result is not immediately implied by the former, since there are multi-label concept classes of dimension 1 that are not contained in maximum classes of dimension 1.
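To make the notion of a sample compression scheme concrete, here is a toy sketch for the classical single-label (binary) case with dimension d = 1, using threshold concepts on a small domain. The class, the compression rule, and all names below are our own illustration under these assumptions, not the paper's multi-label construction: every labeled sample of a threshold concept is compressed to at most one labeled point, from which a consistent hypothesis is reconstructed.

```python
from itertools import combinations

# Toy concept class: thresholds on {1,...,5}.
# Concept c_t labels x with 1 iff x >= t; this class has VC dimension 1.
DOMAIN = list(range(1, 6))
THRESHOLDS = list(range(1, 7))  # t = 6 gives the all-zero concept

def label(t, x):
    return 1 if x >= t else 0

def compress(sample):
    """Compress a labeled sample to a subset of size at most 1.

    Keep the smallest positive point if one exists,
    otherwise the largest negative point.
    """
    positives = [(x, y) for (x, y) in sample if y == 1]
    if positives:
        return [min(positives)]
    negatives = [(x, y) for (x, y) in sample if y == 0]
    if negatives:
        return [max(negatives)]
    return []

def decompress(compressed):
    """Reconstruct a threshold hypothesis from the compressed set."""
    if not compressed:
        return lambda z: 1           # empty sample: any concept is consistent
    (x, y) = compressed[0]
    t = x if y == 1 else x + 1       # positive point bounds t from above,
    return lambda z: label(t, z)     # negative point from below

# Verify: for every threshold concept and every sample of it, the
# decompressed hypothesis agrees with the sample on all its points.
all_correct = True
for t in THRESHOLDS:
    for k in range(len(DOMAIN) + 1):
        for pts in combinations(DOMAIN, k):
            sample = [(x, label(t, x)) for x in pts]
            comp = compress(sample)
            h = decompress(comp)
            if len(comp) > 1 or any(h(x) != y for (x, y) in sample):
                all_correct = False
```

The exhaustive check above confirms that compression to size at most 1 suffices for this dimension-1 class; the paper's results establish the analogous bound (size at most d) for maximum multi-label classes of dimension d, where the argument is substantially more involved.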
