Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov Noise

Chicheng Zhang, Yinan Li
Proceedings of Thirty Fourth Conference on Learning Theory, PMLR 134:4526-4527, 2021.

Abstract

We give a computationally-efficient PAC active learning algorithm for $d$-dimensional homogeneous halfspaces that can tolerate Massart noise (Massart and Nedelec, 2006) and Tsybakov noise (Tsybakov, 2004). Specialized to the $\eta$-Massart noise setting, our algorithm achieves an information-theoretically near-optimal label complexity of $\tilde{O}\left( \frac{d}{(1-2\eta)^2} \mathrm{polylog}(\frac1\epsilon) \right)$ under a wide range of unlabeled data distributions (specifically, the family of ``structured distributions'' defined in Diakonikolas et al. (2020)). Under the more challenging Tsybakov noise condition, we identify two subfamilies of noise conditions, under which our efficient algorithm provides label complexity guarantees strictly lower than passive learning algorithms.

Cite this Paper


BibTeX
@InProceedings{pmlr-v134-zhang21a,
  title     = {Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov Noise},
  author    = {Zhang, Chicheng and Li, Yinan},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  pages     = {4526--4527},
  year      = {2021},
  editor    = {Belkin, Mikhail and Kpotufe, Samory},
  volume    = {134},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v134/zhang21a/zhang21a.pdf},
  url       = {https://proceedings.mlr.press/v134/zhang21a.html},
  abstract  = {We give a computationally-efficient PAC active learning algorithm for $d$-dimensional homogeneous halfspaces that can tolerate Massart noise (Massart and Nedelec, 2006) and Tsybakov noise (Tsybakov, 2004). Specialized to the $\eta$-Massart noise setting, our algorithm achieves an information-theoretically near-optimal label complexity of $\tilde{O}\left( \frac{d}{(1-2\eta)^2} \mathrm{polylog}(\frac1\epsilon) \right)$ under a wide range of unlabeled data distributions (specifically, the family of ``structured distributions'' defined in Diakonikolas et al. (2020)). Under the more challenging Tsybakov noise condition, we identify two subfamilies of noise conditions, under which our efficient algorithm provides label complexity guarantees strictly lower than passive learning algorithms.}
}
Endnote
%0 Conference Paper
%T Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov Noise
%A Chicheng Zhang
%A Yinan Li
%B Proceedings of Thirty Fourth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Mikhail Belkin
%E Samory Kpotufe
%F pmlr-v134-zhang21a
%I PMLR
%P 4526--4527
%U https://proceedings.mlr.press/v134/zhang21a.html
%V 134
%X We give a computationally-efficient PAC active learning algorithm for $d$-dimensional homogeneous halfspaces that can tolerate Massart noise (Massart and Nedelec, 2006) and Tsybakov noise (Tsybakov, 2004). Specialized to the $\eta$-Massart noise setting, our algorithm achieves an information-theoretically near-optimal label complexity of $\tilde{O}\left( \frac{d}{(1-2\eta)^2} \mathrm{polylog}(\frac1\epsilon) \right)$ under a wide range of unlabeled data distributions (specifically, the family of ``structured distributions'' defined in Diakonikolas et al. (2020)). Under the more challenging Tsybakov noise condition, we identify two subfamilies of noise conditions, under which our efficient algorithm provides label complexity guarantees strictly lower than passive learning algorithms.
APA
Zhang, C. & Li, Y. (2021). Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov Noise. Proceedings of Thirty Fourth Conference on Learning Theory, in Proceedings of Machine Learning Research 134:4526-4527. Available from https://proceedings.mlr.press/v134/zhang21a.html.