Optimal Collusion-Free Teaching
Proceedings of the 30th International Conference on Algorithmic Learning Theory, PMLR 98:506-528, 2019.
Abstract
Formal models of learning from teachers need to respect certain criteria to avoid collusion. The most commonly accepted notion of collusion-freeness was proposed by Goldman and Mathias (1996), and various teaching models obeying their criterion have been studied. For each model M and each concept class C, a parameter M-TD(C) refers to the teaching dimension of concept class C in model M, defined as the number of examples required for teaching a concept, in the worst case over all concepts in C.
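The worst-case definition above can be illustrated with a small brute-force computation of the classical teaching dimension (this sketch is not from the paper; it assumes concepts are represented as sets over a finite domain, and a teaching set for a concept is a smallest set of labeled examples consistent with that concept and no other concept in the class):

```python
from itertools import combinations

def teaching_dim(domain, concepts):
    """Classical teaching dimension: for each concept, find the size of the
    smallest labeled-example set ruling out every other concept; return the
    worst case (maximum) over all concepts in the class."""
    def consistent(concept, sample):
        # A concept is consistent with a sample if it labels every example
        # in the sample the same way the sample does.
        return all((x in concept) == label for x, label in sample)

    worst = 0
    for c in concepts:
        labeled = [(x, x in c) for x in domain]  # examples labeled by c
        for k in range(len(domain) + 1):
            # Does some k-element subset of c's labeled examples rule out
            # every other concept?
            if any(
                all(not consistent(d, s) for d in concepts if d != c)
                for s in combinations(labeled, k)
            ):
                worst = max(worst, k)
                break
    return worst

# Singletons over a 3-element domain: each concept is taught by a single
# positive example, so the teaching dimension is 1.
domain = [0, 1, 2]
concepts = [frozenset({0}), frozenset({1}), frozenset({2})]
print(teaching_dim(domain, concepts))  # 1
```

By contrast, the class of all subsets of {0, 1} has teaching dimension 2, since every point must be labeled to pin down a concept.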
This paper introduces a new model of teaching, called no-clash teaching, together with the corresponding parameter NCTD(C). No-clash teaching is provably optimal in the strong sense that, given any concept class C and any model M obeying Goldman and Mathias's collusion-freeness criterion, one obtains NCTD(C) ≤ M-TD(C). We also study a corresponding notion NCTD+ for the case of learning from positive data only, establish useful bounds on NCTD and NCTD+, and discuss relations of these parameters to the VC-dimension and to sample compression.
In addition to formulating an optimal model of collusion-free teaching, our main results concern the computational complexity of deciding whether NCTD+(C) = k (or NCTD(C) = k) for a given C and k. We show that some of these decision problems are equivalent to the existence question for certain constrained matchings in bipartite graphs. Our NP-hardness results for the latter are of independent interest in the study of constrained graph matchings.