Efficient Private Algorithms for Learning Large-Margin Halfspaces
Proceedings of the 31st International Conference on Algorithmic Learning Theory, PMLR 117:704-724, 2020.
We present new differentially private algorithms for learning a large-margin halfspace. In contrast to previous approaches, which rely on either differentially private simulations of the statistical query model or on private convex optimization, our algorithms have sample complexity that depends only on the margin of the data, and not on the dimension. We complement our results with a lower bound, showing that the dependence of our upper bounds on the margin is optimal.
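For concreteness, the notion of margin referred to above can be made precise as follows. This is the standard definition under the usual normalization of the data to the unit ball; the paper's exact conventions may differ slightly.

```latex
% Labeled data (x_1, y_1), ..., (x_n, y_n) with ||x_i|| <= 1 and y_i in {-1, +1}.
% The margin of the data with respect to a unit vector w is
%   gamma(w) = min_i y_i <w, x_i>,
% and the data has margin gamma if some halfspace achieves it:
\gamma(w) \;=\; \min_{i \in [n]} \, y_i \langle w, x_i \rangle,
\qquad
\gamma \;=\; \max_{\|w\|_2 = 1} \gamma(w).
```

A sample complexity depending only on \(\gamma\) (and the privacy parameters), rather than on the ambient dimension, is the dimension-independence claimed in the abstract.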