Testable Learning of General Halfspaces with Adversarial Label Noise

Ilias Diakonikolas, Daniel Kane, Sihan Liu, Nikos Zarifis
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:1308-1335, 2024.

Abstract

We study the task of testable learning of general — not necessarily homogeneous — halfspaces with adversarial label noise with respect to the Gaussian distribution. In the testable learning framework, the goal is to develop a tester-learner such that if the data passes the tester, then one can trust the output of the robust learner on the data. Our main result is the first polynomial-time tester-learner for general halfspaces that achieves dimension-independent misclassification error. At the heart of our approach is a new methodology to reduce testable learning of general halfspaces to testable learning of nearly homogeneous halfspaces, which may be of broader interest.
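For readers unfamiliar with the framework, below is a minimal sketch of the tester-learner contract described in the abstract. It is not the paper's algorithm: the tester and learner bodies are hypothetical placeholders (a crude moment check and a least-squares fit), shown only to illustrate the guarantee structure — if the tester accepts the sample, the learner's output is trusted; if it rejects, no guarantee is claimed.

```python
# Minimal sketch of the tester-learner pattern (hypothetical placeholders,
# NOT the algorithm from the paper).
import numpy as np


def tester(X: np.ndarray) -> bool:
    """Hypothetical distribution test: accept if low-order moments of the
    sample are close to those of the standard Gaussian N(0, I)."""
    d = X.shape[1]
    mean_ok = np.linalg.norm(X.mean(axis=0)) < 0.1
    cov_ok = np.linalg.norm(np.cov(X, rowvar=False) - np.eye(d)) < 0.1 * d
    return mean_ok and cov_ok


def robust_learner(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Hypothetical learner for a general (non-homogeneous) halfspace
    sign(<w, x> - t): a plain least-squares fit stands in for the paper's
    robust algorithm."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    coeffs, *_ = np.linalg.lstsq(Xb, y.astype(float), rcond=None)
    return coeffs  # first d entries ~ w, last entry ~ threshold t


def tester_learner(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """If the tester accepts, return the learner's hypothesis (trusted);
    otherwise reject the data and make no guarantee."""
    if not tester(X):
        raise ValueError("data rejected by the tester; no guarantee applies")
    return robust_learner(X, y)
```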

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-diakonikolas24a,
  title     = {Testable Learning of General Halfspaces with Adversarial Label Noise},
  author    = {Diakonikolas, Ilias and Kane, Daniel and Liu, Sihan and Zarifis, Nikos},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {1308--1335},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/diakonikolas24a/diakonikolas24a.pdf},
  url       = {https://proceedings.mlr.press/v247/diakonikolas24a.html},
  abstract  = {We study the task of testable learning of general — not necessarily homogeneous — halfspaces with adversarial label noise with respect to the Gaussian distribution. In the testable learning framework, the goal is to develop a tester-learner such that if the data passes the tester, then one can trust the output of the robust learner on the data. Our main result is the first polynomial-time tester-learner for general halfspaces that achieves dimension-independent misclassification error. At the heart of our approach is a new methodology to reduce testable learning of general halfspaces to testable learning of nearly homogeneous halfspaces, which may be of broader interest.}
}
Endnote
%0 Conference Paper
%T Testable Learning of General Halfspaces with Adversarial Label Noise
%A Ilias Diakonikolas
%A Daniel Kane
%A Sihan Liu
%A Nikos Zarifis
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-diakonikolas24a
%I PMLR
%P 1308--1335
%U https://proceedings.mlr.press/v247/diakonikolas24a.html
%V 247
%X We study the task of testable learning of general — not necessarily homogeneous — halfspaces with adversarial label noise with respect to the Gaussian distribution. In the testable learning framework, the goal is to develop a tester-learner such that if the data passes the tester, then one can trust the output of the robust learner on the data. Our main result is the first polynomial-time tester-learner for general halfspaces that achieves dimension-independent misclassification error. At the heart of our approach is a new methodology to reduce testable learning of general halfspaces to testable learning of nearly homogeneous halfspaces, which may be of broader interest.
APA
Diakonikolas, I., Kane, D., Liu, S. & Zarifis, N. (2024). Testable Learning of General Halfspaces with Adversarial Label Noise. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:1308-1335. Available from https://proceedings.mlr.press/v247/diakonikolas24a.html.
