Learning General Halfspaces with Adversarial Label Noise via Online Gradient Descent

Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:5118-5141, 2022.

Abstract

We study the problem of learning general (i.e., not necessarily homogeneous) halfspaces with adversarial label noise under the Gaussian distribution. Prior work has provided a sophisticated polynomial-time algorithm for this problem. In this work, we show that the problem can be solved directly via online gradient descent applied to a sequence of natural non-convex surrogates. This approach yields a simple iterative learning algorithm for general halfspaces with near-optimal sample complexity, runtime, and error guarantee. At the conceptual level, our work establishes an intriguing connection between learning halfspaces with adversarial noise and online optimization that may find other applications.
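To make the setting concrete, the following minimal sketch runs online (stochastic) gradient descent on a smooth non-convex surrogate of the 0-1 loss to recover a general, non-homogeneous halfspace sign(<w*, x> + b*) under a Gaussian marginal with a small fraction of corrupted labels. The paper uses a carefully designed sequence of surrogates with formal guarantees; here a single logistic-style surrogate and random label flips stand in purely for illustration, and all names (`w_star`, `sample_batch`, step size, iteration count) are illustrative choices, not the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Ground-truth general (non-homogeneous) halfspace: sign(<w*, x> + b*).
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)
b_star = 0.3

def sample_batch(n, noise_rate=0.05):
    # Gaussian marginal; a small fraction of labels is flipped
    # (random flips here, standing in for adversarial label noise).
    x = rng.normal(size=(n, d))
    y = np.sign(x @ w_star + b_star)
    flip = rng.random(n) < noise_rate
    y[flip] = -y[flip]
    return x, y

# Online gradient descent on a logistic surrogate log(1 + exp(-margin)),
# one fresh sample per step.
w = np.zeros(d)
b = 0.0
eta = 0.05
for t in range(2000):
    x, y = sample_batch(1)
    margin = y * (x @ w + b)          # shape (1,)
    g = -y / (1.0 + np.exp(margin))   # derivative of the surrogate w.r.t. margin
    w -= eta * (g[:, None] * x).sum(axis=0)
    b -= eta * g.sum()

# Misclassification error of the learned halfspace on fresh clean data.
x, _ = sample_batch(5000, noise_rate=0.0)
y_true = np.sign(x @ w_star + b_star)
y_hat = np.sign(x @ w + b)
err = np.mean(y_hat != y_true)
print(f"0-1 error vs. true halfspace: {err:.3f}")
```

The point of the sketch is the shape of the algorithm: a single pass of cheap per-sample gradient updates on a surrogate loss, rather than a multi-stage optimization, which is what the paper formalizes for its non-convex surrogates.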

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-diakonikolas22b,
  title     = {Learning General Halfspaces with Adversarial Label Noise via Online Gradient Descent},
  author    = {Diakonikolas, Ilias and Kontonis, Vasilis and Tzamos, Christos and Zarifis, Nikos},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {5118--5141},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/diakonikolas22b/diakonikolas22b.pdf},
  url       = {https://proceedings.mlr.press/v162/diakonikolas22b.html},
  abstract  = {We study the problem of learning general --- i.e., not necessarily homogeneous --- halfspaces with adversarial label noise under the Gaussian distribution. Prior work has provided a sophisticated polynomial-time algorithm for this problem. In this work, we show that the problem can be solved directly via online gradient descent applied to a sequence of natural non-convex surrogates. This approach yields a simple iterative learning algorithm for general halfspaces with near-optimal sample complexity, runtime, and error guarantee. At the conceptual level, our work establishes an intriguing connection between learning halfspaces with adversarial noise and online optimization that may find other applications.}
}
Endnote
%0 Conference Paper
%T Learning General Halfspaces with Adversarial Label Noise via Online Gradient Descent
%A Ilias Diakonikolas
%A Vasilis Kontonis
%A Christos Tzamos
%A Nikos Zarifis
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-diakonikolas22b
%I PMLR
%P 5118--5141
%U https://proceedings.mlr.press/v162/diakonikolas22b.html
%V 162
%X We study the problem of learning general (i.e., not necessarily homogeneous) halfspaces with adversarial label noise under the Gaussian distribution. Prior work has provided a sophisticated polynomial-time algorithm for this problem. In this work, we show that the problem can be solved directly via online gradient descent applied to a sequence of natural non-convex surrogates. This approach yields a simple iterative learning algorithm for general halfspaces with near-optimal sample complexity, runtime, and error guarantee. At the conceptual level, our work establishes an intriguing connection between learning halfspaces with adversarial noise and online optimization that may find other applications.
APA
Diakonikolas, I., Kontonis, V., Tzamos, C. & Zarifis, N. (2022). Learning General Halfspaces with Adversarial Label Noise via Online Gradient Descent. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:5118-5141. Available from https://proceedings.mlr.press/v162/diakonikolas22b.html.