Superconstant Inapproximability of Decision Tree Learning

Caleb Koch, Carmen Strassle, Li-Yang Tan
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:2979-3010, 2024.

Abstract

We consider the task of properly PAC learning decision trees with queries. Recent work of Koch, Strassle, and Tan showed that the strictest version of this task, where the hypothesis tree T is required to be optimally small, is NP-hard. Their work leaves open the question of whether the task remains intractable if T is only required to be close to optimal, say within a factor of 2, rather than exactly optimal. We answer this affirmatively and show that the task indeed remains NP-hard even if T is allowed to be within any constant factor of optimal. More generally, our result allows for a smooth tradeoff between the hardness assumption and inapproximability factor. As Koch et al.’s techniques do not appear to be amenable to such a strengthening, we first recover their result with a new and simpler proof, which we couple with a new XOR lemma for decision trees. While there is a large body of work on XOR lemmas for decision trees, our setting necessitates parameters that are extremely sharp and are not known to be attainable by existing such lemmas. Our work also carries new implications for the related problem of Decision Tree Minimization.
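For intuition, the XOR lemmas the abstract refers to are hardness-amplification statements for decision trees. The following is a generic template only, not the paper's exact lemma; the size bound s' and error slack delta below are placeholder parameters, and the paper requires unusually sharp tradeoffs among them. Define the k-fold XOR of f : {0,1}^n -> {0,1} by

\[
f^{\oplus k}\bigl(x^{(1)},\ldots,x^{(k)}\bigr) \;=\; f\bigl(x^{(1)}\bigr) \oplus \cdots \oplus f\bigl(x^{(k)}\bigr).
\]

A generic XOR lemma then asserts: if every size-$s$ decision tree $T$ satisfies

\[
\Pr_{x \sim \mathcal{D}}\bigl[\,T(x) \neq f(x)\,\bigr] \;\geq\; \varepsilon,
\]

then every size-$s'$ decision tree $T'$ satisfies

\[
\Pr_{(x^{(1)},\ldots,x^{(k)}) \sim \mathcal{D}^{k}}\Bigl[\,T'\bigl(x^{(1)},\ldots,x^{(k)}\bigr) \neq f^{\oplus k}\bigl(x^{(1)},\ldots,x^{(k)}\bigr)\,\Bigr] \;\geq\; \frac{1}{2} - \delta,
\]

where $s'$ and $\delta$ depend on $s$, $\varepsilon$, and $k$. The new lemma in this paper pins down such dependencies with parameters sharp enough to drive the inapproximability reduction; see the paper for the precise statement.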

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-koch24a,
  title     = {Superconstant Inapproximability of Decision Tree Learning},
  author    = {Koch, Caleb and Strassle, Carmen and Tan, Li-Yang},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {2979--3010},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/koch24a/koch24a.pdf},
  url       = {https://proceedings.mlr.press/v247/koch24a.html}
}
Endnote
%0 Conference Paper
%T Superconstant Inapproximability of Decision Tree Learning
%A Caleb Koch
%A Carmen Strassle
%A Li-Yang Tan
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-koch24a
%I PMLR
%P 2979--3010
%U https://proceedings.mlr.press/v247/koch24a.html
%V 247
APA
Koch, C., Strassle, C. & Tan, L. (2024). Superconstant Inapproximability of Decision Tree Learning. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:2979-3010. Available from https://proceedings.mlr.press/v247/koch24a.html.
