ID3 Learns Juntas for Smoothed Product Distributions

Alon Brutzkus, Amit Daniely, Eran Malach
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:902-915, 2020.

Abstract

In recent years, there have been many attempts to understand popular heuristics. One example of such a heuristic is the ID3 algorithm for learning decision trees. This algorithm is widely used in practice, but few theoretical works study its behavior. In this paper, we analyze the ID3 algorithm when the target function is a $k$-Junta, a function that depends on $k$ out of the $n$ input variables. We prove that when $k = \log n$, the ID3 algorithm learns $k$-Juntas in polynomial time in the smoothed analysis model of Kalai and Teng (2008). That is, we show a learnability result when the observed distribution is a “noisy” variant of the original distribution.
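For readers unfamiliar with the heuristic being analyzed: ID3 greedily grows a decision tree by repeatedly splitting on the feature with the largest information gain (empirical entropy reduction). The sketch below is a minimal illustrative version over Boolean features; the function names and the simplified binary setting are our own, not taken from the paper.

```python
import math
from collections import Counter

def entropy(labels):
    # Empirical entropy of a list of labels.
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(X, y, j):
    # Entropy reduction from splitting the sample on Boolean feature j.
    left = [yi for xi, yi in zip(X, y) if xi[j] == 0]
    right = [yi for xi, yi in zip(X, y) if xi[j] == 1]
    n = len(y)
    return (entropy(y)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def id3(X, y, features):
    # Base cases: pure leaf, or no features left (majority vote).
    if len(set(y)) == 1:
        return y[0]
    if not features:
        return Counter(y).most_common(1)[0][0]
    # Greedy step: split on the feature with maximal information gain.
    j = max(features, key=lambda f: information_gain(X, y, f))
    rest = [f for f in features if f != j]
    subtree = {}
    for v in (0, 1):
        Xv = [xi for xi in X if xi[j] == v]
        yv = [yi for xi, yi in zip(X, y) if xi[j] == v]
        # Empty branch: fall back to the parent's majority label.
        subtree[v] = Counter(y).most_common(1)[0][0] if not yv else id3(Xv, yv, rest)
    return (j, subtree)

def predict(tree, x):
    # Walk internal nodes (feature, children) until reaching a leaf label.
    while isinstance(tree, tuple):
        j, children = tree
        tree = children[x[j]]
    return tree
```

On a 2-Junta such as $f(x) = x_0 \wedge x_1$ over $n = 3$ variables, this greedy procedure recovers the target exactly from the full truth table, since the relevant features have strictly positive information gain at the root.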

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-brutzkus20a,
  title     = {ID3 Learns Juntas for Smoothed Product Distributions},
  author    = {Brutzkus, Alon and Daniely, Amit and Malach, Eran},
  pages     = {902--915},
  year      = {2020},
  editor    = {Jacob Abernethy and Shivani Agarwal},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/brutzkus20a/brutzkus20a.pdf},
  url       = {http://proceedings.mlr.press/v125/brutzkus20a.html},
  abstract  = {In recent years, there are many attempts to understand popular heuristics. An example of such heuristic algorithm is the ID3 algorithm for learning decision trees. This algorithm is commonly used in practice, but there are very few theoretical works studying its behavior. In this paper, we analyze the ID3 algorithm, when the target function is a $k$-Junta, a function that depends on $k$ out of $n$ variables of the input. We prove that when $k = \log n$, the ID3 algorithm learns in polynomial time $k$-Juntas, in the smoothed analysis model of Kalai and Teng (2008). That is, we show a learnability result when the observed distribution is a “noisy” variant of the original distribution.}
}
Endnote
%0 Conference Paper
%T ID3 Learns Juntas for Smoothed Product Distributions
%A Alon Brutzkus
%A Amit Daniely
%A Eran Malach
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-brutzkus20a
%I PMLR
%J Proceedings of Machine Learning Research
%P 902--915
%U http://proceedings.mlr.press
%V 125
%W PMLR
%X In recent years, there are many attempts to understand popular heuristics. An example of such heuristic algorithm is the ID3 algorithm for learning decision trees. This algorithm is commonly used in practice, but there are very few theoretical works studying its behavior. In this paper, we analyze the ID3 algorithm, when the target function is a $k$-Junta, a function that depends on $k$ out of $n$ variables of the input. We prove that when $k = \log n$, the ID3 algorithm learns in polynomial time $k$-Juntas, in the smoothed analysis model of Kalai and Teng (2008). That is, we show a learnability result when the observed distribution is a “noisy” variant of the original distribution.
APA
Brutzkus, A., Daniely, A. & Malach, E. (2020). ID3 Learns Juntas for Smoothed Product Distributions. Proceedings of Thirty Third Conference on Learning Theory, in PMLR 125:902-915