Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:2211-2239, 2023.
Abstract
We study the problem of PAC learning γ-margin halfspaces with Random Classification Noise. We establish an information-computation tradeoff suggesting an inherent gap between the sample complexity of the problem and the sample complexity of computationally efficient algorithms. Concretely, the sample complexity of the problem is ˜Θ(1/(γ²ϵ)). We start by giving a simple efficient algorithm with sample complexity ˜O(1/(γ²ϵ²)). Our main result is a lower bound for Statistical Query (SQ) algorithms and low-degree polynomial tests suggesting that the quadratic dependence on 1/ϵ in the sample complexity is inherent for computationally efficient algorithms. Specifically, our results imply a lower bound of ˜Ω(1/(γ^(1/2)ϵ²)) on the sample complexity of any efficient SQ learner or low-degree test.
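To illustrate the gap the abstract describes, the following sketch (not from the paper; the function names and sample values are hypothetical) compares the information-theoretic sample complexity ˜Θ(1/(γ²ϵ)) with the ˜O(1/(γ²ϵ²)) bound achieved by the efficient algorithm, ignoring logarithmic factors:

```python
# Illustrative comparison (hypothetical, not from the paper) of the
# information-theoretic sample complexity ~1/(gamma^2 * eps) versus the
# efficient-algorithm bound ~1/(gamma^2 * eps^2), ignoring log factors.

def info_theoretic_samples(gamma: float, eps: float) -> float:
    """Samples sufficient information-theoretically: ~1/(gamma^2 * eps)."""
    return 1.0 / (gamma**2 * eps)

def efficient_algorithm_samples(gamma: float, eps: float) -> float:
    """Samples used by a computationally efficient learner: ~1/(gamma^2 * eps^2)."""
    return 1.0 / (gamma**2 * eps**2)

gamma = 0.1  # margin parameter (hypothetical example value)
for eps in (0.1, 0.01, 0.001):
    gap = efficient_algorithm_samples(gamma, eps) / info_theoretic_samples(gamma, eps)
    # The multiplicative gap equals 1/eps, reflecting the extra factor of
    # 1/eps that the SQ / low-degree lower bound suggests is unavoidable
    # for efficient algorithms.
    print(f"eps={eps}: gap factor = {gap:.0f}")
```

The printed gap factor grows as 1/ϵ, which is exactly the separation between the two bounds that the SQ and low-degree lower bounds suggest cannot be closed efficiently.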