Non-Clashing Teaching Maps for Balls in Graphs
Proceedings of the Thirty Seventh Conference on Learning Theory, PMLR 247:840-875, 2024.
Abstract
Recently, Kirkpatrick et al. [ALT 2019] and Fallat et al. [JMLR 2023] introduced non-clashing teaching and showed it to be the most efficient machine teaching model satisfying the benchmark for collusion-avoidance set by Goldman and Mathias. A teaching map T for a concept class 𝒞 assigns a (teaching) set T(C) of examples to each concept C ∈ 𝒞. A teaching map is non-clashing if no pair of distinct concepts is consistent with the union of their teaching sets. The size of a non-clashing teaching map (NCTM) T is the maximum size of a teaching set T(C), C ∈ 𝒞. The non-clashing teaching dimension NCTD(𝒞) of 𝒞 is the minimum size of an NCTM for 𝒞. NCTM+s and NCTD+(𝒞) are defined analogously, except that the teacher may use only positive examples.
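The non-clashing condition above can be made concrete with a small sketch. The encoding here is an assumption for illustration, not the paper's formalism: concepts are frozensets over a finite domain, a labeled example is a pair (x, b) meaning "x is in the concept iff b", and a teaching map is a dict from concepts to sets of labeled examples.

```python
from itertools import combinations

def consistent(concept, examples):
    # A concept (a set) is consistent with labeled examples (x, b)
    # iff x lies in the concept exactly when b is True.
    return all((x in concept) == b for x, b in examples)

def is_non_clashing(concepts, T):
    # T maps each concept to its teaching set of labeled examples.
    # Non-clashing: no two distinct concepts are both consistent
    # with the union of their two teaching sets.
    for C1, C2 in combinations(concepts, 2):
        union = T[C1] | T[C2]
        if consistent(C1, union) and consistent(C2, union):
            return False
    return True

# Toy concept class over the domain {1, 2}: a single negative and a
# single positive example on the point 2 already tell the pair apart.
C1 = frozenset({1})
C2 = frozenset({1, 2})
T = {C1: {(2, False)}, C2: {(2, True)}}
print(is_non_clashing([C1, C2], T))
```

Note that with empty teaching sets the same pair would clash, since both concepts are trivially consistent with an empty union; the size of this map is 1, matching the intuition that one well-chosen example per concept can suffice for tiny classes.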
We study NCTMs and NCTM+s for the concept class B(G) consisting of all balls of a graph G. We show that the associated decision problem B-NCTD+ for NCTD+ is NP-complete in split, co-bipartite, and bipartite graphs. Surprisingly, we even prove that, unless the ETH fails, B-NCTD+ does not admit an algorithm running in time 2^{2^{o(vc)}} · n^{O(1)}, nor a kernelization algorithm outputting a kernel with 2^{o(vc)} vertices, where vc is the vertex cover number of G. We complement these lower bounds with matching upper bounds. These are extremely rare results: it is only the second problem in NP to admit such a tight double-exponential lower bound parameterized by vc, and only one of very few problems to admit such an ETH-based conditional lower bound on the number of vertices in a kernel. For trees, interval graphs, cycles, and trees of cycles, we derive NCTM+s or NCTMs for B(G) of size proportional to its VC-dimension. For Gromov-hyperbolic graphs, we design an approximate NCTM+ for B(G) of size 2, in which only pairs of balls with Hausdorff distance larger than some constant must satisfy the non-clashing condition.
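The concept class B(G) of all balls of a graph is easy to enumerate explicitly: the ball B(v, r) is the set of vertices at distance at most r from v, computed by breadth-first search. The following is a minimal sketch under an assumed adjacency-list encoding (a dict mapping each vertex to its neighbors), not code from the paper.

```python
from collections import deque

def bfs_dist(adj, src):
    # Standard BFS: shortest-path distances from src in an
    # unweighted graph given as an adjacency-list dict.
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

def all_balls(adj):
    # The concept class B(G): every distinct ball
    # B(v, r) = {u : d(u, v) <= r}, for all centers v and radii r.
    balls = set()
    n = len(adj)
    for v in adj:
        dist = bfs_dist(adj, v)
        for r in range(n):
            balls.add(frozenset(u for u, d in dist.items() if d <= r))
    return balls

# Path graph 1 - 2 - 3: six distinct balls
# {1}, {2}, {3}, {1,2}, {2,3}, {1,2,3}.
adj = {1: [2], 2: [1, 3], 3: [2]}
print(sorted(sorted(b) for b in all_balls(adj)))
```

Enumerating B(G) this way takes O(n·(n + m)) time for the BFS passes, which makes concrete that the hardness results above concern computing small teaching maps for this class, not constructing the class itself.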