Non-Clashing Teaching Maps for Balls in Graphs

Jérémie Chalopin, Victor Chepoi, Fionn Mc Inerney, Sébastien Ratel
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:840-875, 2024.

Abstract

Recently, Kirkpatrick et al. [ALT 2019] and Fallat et al. [JMLR 2023] introduced non-clashing teaching and showed it to be the most efficient machine teaching model satisfying the benchmark for collusion-avoidance set by Goldman and Mathias. A teaching map $T$ for a concept class $\mathcal{C}$ assigns a (teaching) set $T(C)$ of examples to each concept $C \in \mathcal{C}$. A teaching map is non-clashing if no pair of concepts are consistent with the union of their teaching sets. The size of a non-clashing teaching map (NCTM) $T$ is the maximum size of a teaching set $T(C)$, $C \in \mathcal{C}$. The non-clashing teaching dimension $NCTD(\mathcal{C})$ of $\mathcal{C}$ is the minimum size of an NCTM for $\mathcal{C}$. NCTM$^+$ and NCTD$^+(\mathcal{C})$ are defined analogously, except the teacher may only use positive examples. We study NCTMs and NCTM$^+$s for the concept class $\mathcal{B}(G)$ consisting of all balls of a graph $G$. We show that the associated decision problem \textsc{B-NCTD$^+$} for NCTD$^+$ is \textsf{NP}-complete in split, co-bipartite, and bipartite graphs. Surprisingly, we even prove that, unless the \textsf{ETH} fails, \textsc{B-NCTD$^+$} does not admit an algorithm running in time $2^{2^{o(\mathtt{vc})}}\cdot n^{\mathcal{O}(1)}$, nor a kernelization algorithm outputting a kernel with $2^{o(\mathtt{vc})}$ vertices, where $\mathtt{vc}$ is the vertex cover number of $G$. These are extremely rare results: it is only the second (fourth, resp.) problem in \textsf{NP} to admit such a double-exponential lower bound parameterized by $\mathtt{vc}$ (treewidth, resp.), and only one of very few problems to admit such an \textsf{ETH}-based conditional lower bound on the number of vertices in a kernel. We complement these lower bounds with matching upper bounds. For trees, interval graphs, cycles, and trees of cycles, we derive NCTM$^+$s or NCTMs for $\mathcal{B}(G)$ of size proportional to its VC-dimension. For Gromov-hyperbolic graphs, we design an approximate NCTM$^+$ for $\mathcal{B}(G)$ of size $2$, in which only pairs of balls with Hausdorff distance larger than some constant must satisfy the non-clashing condition.
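
To make the definitions above concrete, here is a minimal Python sketch (not from the paper): it enumerates the balls of a small graph, builds a naive positive-example teaching map, and checks the non-clashing condition as defined in the abstract. The helper names `balls`, `consistent`, and `non_clashing`, as well as the toy teaching map itself, are hypothetical illustrations; the paper's actual constructions for trees, interval graphs, cycles, and hyperbolic graphs are different.

# Minimal sketch (assumptions as stated above): balls of a graph and a
# positive-example (NCTM^+-style) non-clashing check.
from collections import deque
from itertools import combinations

def balls(adj):
    """All balls B_r(v) = {u : d(u, v) <= r} of a graph given as an
    adjacency dict, returned as frozensets of vertices."""
    result = set()
    for v in adj:
        dist = {v: 0}                      # BFS distances from v
        queue = deque([v])
        while queue:
            x = queue.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    queue.append(y)
        for r in range(max(dist.values()) + 1):
            result.add(frozenset(u for u, d in dist.items() if d <= r))
    return result

def consistent(concept, examples):
    """With positive examples only, a concept is consistent with a set of
    examples iff it contains all of them."""
    return examples <= concept

def non_clashing(T):
    """Non-clashing condition: no two distinct concepts may both be
    consistent with the union of their teaching sets."""
    return all(
        not (consistent(C1, T[C1] | T[C2]) and consistent(C2, T[C1] | T[C2]))
        for C1, C2 in combinations(T, 2)
    )

# Toy instance: the path 0 - 1 - 2 - 3, whose balls are exactly its intervals.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
B = balls(adj)
# Naive teaching map: teach each ball by its two extreme vertices
# (a heuristic for this toy case, not the paper's construction).
T = {C: frozenset({min(C), max(C)}) for C in B}
print(len(B), "balls; non-clashing:", non_clashing(T))
print("size of this teaching map:", max(len(S) for S in T.values()))

On this 4-vertex path every ball is an interval, and the two endpoints of an interval already distinguish it from every other interval, so this size-2 map is non-clashing. That matches the spirit of the paper's upper bounds for trees, but it is only a toy check, not their proof.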

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-chalopin24a,
  title     = {Non-Clashing Teaching Maps for Balls in Graphs},
  author    = {Chalopin, J\'{e}r\'{e}mie and Chepoi, Victor and {Mc~Inerney}, Fionn and Ratel, S\'{e}bastien},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {840--875},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/chalopin24a/chalopin24a.pdf},
  url       = {https://proceedings.mlr.press/v247/chalopin24a.html}
}
Endnote
%0 Conference Paper
%T Non-Clashing Teaching Maps for Balls in Graphs
%A Jérémie Chalopin
%A Victor Chepoi
%A Fionn Mc Inerney
%A Sébastien Ratel
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-chalopin24a
%I PMLR
%P 840--875
%U https://proceedings.mlr.press/v247/chalopin24a.html
%V 247
APA
Chalopin, J., Chepoi, V., Mc Inerney, F. & Ratel, S. (2024). Non-Clashing Teaching Maps for Balls in Graphs. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:840-875. Available from https://proceedings.mlr.press/v247/chalopin24a.html.
