Activized Learning with Uniform Classification Noise

Liu Yang, Steve Hanneke
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):370–378, 2013.

Abstract

We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009; 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-yang13c,
  title     = {Activized Learning with Uniform Classification Noise},
  author    = {Liu Yang and Steve Hanneke},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {370--378},
  year      = {2013},
  editor    = {Sanjoy Dasgupta and David McAllester},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/yang13c.pdf},
  url       = {http://proceedings.mlr.press/v28/yang13c.html},
  abstract  = {We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009; 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.}
}
Endnote
%0 Conference Paper
%T Activized Learning with Uniform Classification Noise
%A Liu Yang
%A Steve Hanneke
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-yang13c
%I PMLR
%J Proceedings of Machine Learning Research
%P 370--378
%U http://proceedings.mlr.press
%V 28
%N 2
%W PMLR
%X We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009; 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.
RIS
TY  - CPAPER
TI  - Activized Learning with Uniform Classification Noise
AU  - Liu Yang
AU  - Steve Hanneke
BT  - Proceedings of the 30th International Conference on Machine Learning
PY  - 2013/02/13
DA  - 2013/02/13
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-yang13c
PB  - PMLR
SP  - 370
EP  - 378
DP  - PMLR
L1  - http://proceedings.mlr.press/v28/yang13c.pdf
UR  - http://proceedings.mlr.press/v28/yang13c.html
AB  - We prove that for any VC class, it is possible to transform any passive learning algorithm into an active learning algorithm with strong asymptotic improvements in label complexity for every nontrivial distribution satisfying a uniform classification noise condition. This generalizes a similar result proven by Hanneke (2009; 2012) for the realizable case, and is the first result establishing that such general improvement guarantees are possible in the presence of restricted types of classification noise.
ER  -
APA
Yang, L. & Hanneke, S. (2013). Activized Learning with Uniform Classification Noise. Proceedings of the 30th International Conference on Machine Learning, in PMLR 28(2):370–378.

Related Material