Activized Learning with Uniform Classification Noise

Liu Yang, Steve Hanneke
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):370-378, 2013.

Abstract

We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009, 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-yang13c,
  title     = {Activized Learning with Uniform Classification Noise},
  author    = {Yang, Liu and Hanneke, Steve},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {370--378},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/yang13c.pdf},
  url       = {https://proceedings.mlr.press/v28/yang13c.html},
  abstract  = {We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009, 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.}
}
Endnote
%0 Conference Paper
%T Activized Learning with Uniform Classification Noise
%A Liu Yang
%A Steve Hanneke
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-yang13c
%I PMLR
%P 370--378
%U https://proceedings.mlr.press/v28/yang13c.html
%V 28
%N 2
%X We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009, 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.
RIS
TY  - CPAPER
TI  - Activized Learning with Uniform Classification Noise
AU  - Liu Yang
AU  - Steve Hanneke
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/05/13
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-yang13c
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 2
SP  - 370
EP  - 378
L1  - http://proceedings.mlr.press/v28/yang13c.pdf
UR  - https://proceedings.mlr.press/v28/yang13c.html
AB  - We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009, 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.
ER  -
APA
Yang, L. & Hanneke, S. (2013). Activized Learning with Uniform Classification Noise. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(2):370-378. Available from https://proceedings.mlr.press/v28/yang13c.html.