Efficient, Noise-Tolerant, and Private Learning via Boosting

Mark Bun, Marco Leandro Carmosino, Jessica Sorrell
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:1031-1077, 2020.

Abstract

We introduce a simple framework for designing private boosting algorithms. We give natural conditions under which these algorithms are differentially private, efficient, and noise-tolerant PAC learners. To demonstrate our framework, we use it to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension. We give two sample complexity bounds for our large-margin halfspace learner. One bound is based only on differential privacy, and uses this guarantee as an asset for ensuring generalization. This first bound illustrates a general methodology for obtaining PAC learners from privacy, which may be of independent interest. The second bound uses standard techniques from the theory of large-margin classification (the fat-shattering dimension) to match the best known sample complexity for differentially private learning of large-margin halfspaces, while additionally tolerating random label noise.
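The paper's contribution is a framework for making the boosting loop itself differentially private and noise-tolerant. As context for readers unfamiliar with boosting, here is a minimal, non-private AdaBoost loop with threshold stumps as weak learners. This is the classical algorithm the framework builds on, not the authors' private variant; the dataset and names are purely illustrative.

```python
import math

# Toy 1D dataset (illustrative): labels in {-1, +1}, not separable by any single stump.
X = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
y = [+1, +1, -1, -1, +1, +1, -1, -1]

def stump_predict(theta, sign, x):
    """A threshold stump: predict `sign` left of theta, `-sign` to the right."""
    return sign if x < theta else -sign

def best_stump(weights):
    """Weak learner: exhaustive search for the stump with least weighted error."""
    best = None
    for theta in [x + 0.05 for x in X]:
        for sign in (+1, -1):
            err = sum(w for w, x, label in zip(weights, X, y)
                      if stump_predict(theta, sign, x) != label)
            if best is None or err < best[0]:
                best = (err, theta, sign)
    return best

def boost(rounds=3):
    n = len(X)
    weights = [1.0 / n] * n  # start uniform over the sample
    ensemble = []
    for _ in range(rounds):
        err, theta, sign = best_stump(weights)
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-10))
        ensemble.append((alpha, theta, sign))
        # Reweighting step: upweight misclassified points, then renormalize.
        # A private booster must carry out (an analogue of) this step in a
        # differentially private way, since the weights depend on the data.
        weights = [w * math.exp(-alpha * label * stump_predict(theta, sign, x))
                   for w, x, label in zip(weights, X, y)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    """Final hypothesis: weighted majority vote of the weak hypotheses."""
    score = sum(alpha * stump_predict(theta, sign, x)
                for alpha, theta, sign in ensemble)
    return +1 if score >= 0 else -1

ensemble = boost()
train_errors = sum(predict(ensemble, x) != label for x, label in zip(X, y))
print(train_errors)  # 0: three rounds suffice on this toy sample
```

No single stump classifies this sample correctly, but three boosting rounds drive the training error to zero, which is the weak-to-strong amplification that the paper's framework preserves under privacy and label-noise constraints.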

Cite this Paper

BibTeX
@InProceedings{pmlr-v125-bun20a,
  title     = {Efficient, Noise-Tolerant, and Private Learning via Boosting},
  author    = {Bun, Mark and Carmosino, Marco Leandro and Sorrell, Jessica},
  booktitle = {Proceedings of Thirty Third Conference on Learning Theory},
  pages     = {1031--1077},
  year      = {2020},
  editor    = {Abernethy, Jacob and Agarwal, Shivani},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/bun20a/bun20a.pdf},
  url       = {https://proceedings.mlr.press/v125/bun20a.html},
  abstract  = {We introduce a simple framework for designing private boosting algorithms. We give natural conditions under which these algorithms are differentially private, efficient, and noise-tolerant PAC learners. To demonstrate our framework, we use it to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension. We give two sample complexity bounds for our large-margin halfspace learner. One bound is based only on differential privacy, and uses this guarantee as an asset for ensuring generalization. This first bound illustrates a general methodology for obtaining PAC learners from privacy, which may be of independent interest. The second bound uses standard techniques from the theory of large-margin classification (the fat-shattering dimension) to match the best known sample complexity for differentially private learning of large-margin halfspaces, while additionally tolerating random label noise.}
}
Endnote
%0 Conference Paper
%T Efficient, Noise-Tolerant, and Private Learning via Boosting
%A Mark Bun
%A Marco Leandro Carmosino
%A Jessica Sorrell
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-bun20a
%I PMLR
%P 1031--1077
%U https://proceedings.mlr.press/v125/bun20a.html
%V 125
%X We introduce a simple framework for designing private boosting algorithms. We give natural conditions under which these algorithms are differentially private, efficient, and noise-tolerant PAC learners. To demonstrate our framework, we use it to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension. We give two sample complexity bounds for our large-margin halfspace learner. One bound is based only on differential privacy, and uses this guarantee as an asset for ensuring generalization. This first bound illustrates a general methodology for obtaining PAC learners from privacy, which may be of independent interest. The second bound uses standard techniques from the theory of large-margin classification (the fat-shattering dimension) to match the best known sample complexity for differentially private learning of large-margin halfspaces, while additionally tolerating random label noise.
APA
Bun, M., Carmosino, M.L. & Sorrell, J. (2020). Efficient, Noise-Tolerant, and Private Learning via Boosting. Proceedings of Thirty Third Conference on Learning Theory, in Proceedings of Machine Learning Research 125:1031-1077. Available from https://proceedings.mlr.press/v125/bun20a.html.