Learning Halfspaces with Massart Noise Under Structured Distributions

Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:1486-1513, 2020.

Abstract

We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model. We give the first computationally efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions. This resolves an open question posed in a number of prior works. Our approach is extremely simple: We identify a smooth non-convex surrogate loss with the property that any approximate stationary point of this loss defines a halfspace that is close to the target halfspace. Given this structural result, we can use SGD to solve the underlying learning problem.
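To make the high-level approach concrete, here is a minimal illustrative sketch, not the paper's exact algorithm: the surrogate loss, the Massart noise model, and all parameter choices below are assumptions chosen for illustration. We draw Gaussian data (a log-concave distribution), flip each label with a sample-dependent probability bounded by a constant less than 1/2 (Massart noise), and run projected SGD on a smooth non-convex sigmoid surrogate over the unit sphere, then measure the angle to the target halfspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumptions, not the paper's construction):
# Gaussian marginal (log-concave), target halfspace w*, and Massart
# noise flipping each label with probability eta(x) <= eta_max < 1/2.
d, n, eta_max = 10, 20000, 0.2
w_star = np.zeros(d)
w_star[0] = 1.0
X = rng.standard_normal((n, d))
y = np.sign(X @ w_star)
# Heterogeneous flip rates eta_i = eta_max * u_i, u_i ~ Uniform[0, 1].
flip = rng.random(n) < eta_max * rng.random(n)
y[flip] *= -1.0

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Smooth non-convex surrogate (a hypothetical stand-in): per-sample
# loss sigmoid(-y * <w, x>), minimized by projected SGD on the sphere.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
lr = 0.1
for _ in range(20000):
    i = rng.integers(n)
    m = y[i] * (X[i] @ w)                      # signed margin of sample i
    s = sigmoid(-m)
    g = -y[i] * X[i] * s * (1.0 - s)           # gradient of sigmoid(-m) in w
    w -= lr * g
    w /= np.linalg.norm(w)                     # project back to the unit sphere

angle = np.arccos(np.clip(w @ w_star, -1.0, 1.0))
acc = np.mean(np.sign(X @ w) == np.sign(X @ w_star))
```

Despite the surrogate being non-convex, plain SGD suffices here: the structural claim in the abstract is exactly that approximate stationary points of such a loss already define halfspaces close to the target, so no global optimization is needed.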

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-diakonikolas20c,
  title     = {Learning Halfspaces with Massart Noise Under Structured Distributions},
  author    = {Diakonikolas, Ilias and Kontonis, Vasilis and Tzamos, Christos and Zarifis, Nikos},
  pages     = {1486--1513},
  year      = {2020},
  editor    = {Jacob Abernethy and Shivani Agarwal},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/diakonikolas20c/diakonikolas20c.pdf},
  url       = {http://proceedings.mlr.press/v125/diakonikolas20c.html},
  abstract  = {We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model. We give the first computationally efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions. This resolves an open question posed in a number of prior works. Our approach is extremely simple: We identify a smooth {\em non-convex} surrogate loss with the property that any approximate stationary point of this loss defines a halfspace that is close to the target halfspace. Given this structural result, we can use SGD to solve the underlying learning problem.}
}
Endnote
%0 Conference Paper
%T Learning Halfspaces with Massart Noise Under Structured Distributions
%A Ilias Diakonikolas
%A Vasilis Kontonis
%A Christos Tzamos
%A Nikos Zarifis
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-diakonikolas20c
%I PMLR
%J Proceedings of Machine Learning Research
%P 1486--1513
%U http://proceedings.mlr.press
%V 125
%W PMLR
%X We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model. We give the first computationally efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions. This resolves an open question posed in a number of prior works. Our approach is extremely simple: We identify a smooth non-convex surrogate loss with the property that any approximate stationary point of this loss defines a halfspace that is close to the target halfspace. Given this structural result, we can use SGD to solve the underlying learning problem.
APA
Diakonikolas, I., Kontonis, V., Tzamos, C. & Zarifis, N. (2020). Learning Halfspaces with Massart Noise Under Structured Distributions. Proceedings of Thirty Third Conference on Learning Theory, in PMLR 125:1486-1513.

Related Material