PLAL: Cluster-based active learning

Ruth Urner, Sharon Wulff, Shai Ben-David
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:376-397, 2013.

Abstract

We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.
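The querying scheme the abstract describes — query the labels of a few points, and propagate a label to a whole region only when the queried labels agree — can be illustrated with a toy sketch. This is not the PLAL algorithm from the paper (which comes with formal guarantees under Probabilistic Lipschitzness); it is a minimal recursive-splitting illustration, and the names (`label_cell`, `oracle`, the median split rule) are invented for the example.

```python
import random
from collections import Counter

def label_cell(points, oracle, queried, k=3, depth=0, max_depth=6):
    """Label every point in `points`, querying the oracle sparingly.

    Query the labels of up to k sampled points. If they all agree (or the
    cell is tiny, or the recursion is deep enough), assign the majority
    queried label to the entire cell; otherwise split the cell in half
    along one coordinate and recurse on both halves.
    """
    if not points:
        return {}
    sample = random.sample(points, min(k, len(points)))
    labels = [oracle(p) for p in sample]
    queried.extend(sample)
    if len(set(labels)) == 1 or len(points) <= k or depth >= max_depth:
        majority = Counter(labels).most_common(1)[0][0]
        return {p: majority for p in points}
    axis = depth % 2  # alternate split axis for 2-D data
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    out = label_cell(points[:mid], oracle, queried, k, depth + 1, max_depth)
    out.update(label_cell(points[mid:], oracle, queried, k, depth + 1, max_depth))
    return out

# Usage on a well-clustered toy problem: the label depends only on x < 0.5,
# so large homogeneous cells get labeled after only a handful of queries.
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(400)]
oracle = lambda p: int(p[0] < 0.5)
queried = []
labeling = label_cell(pts, oracle, queried)
print(len(queried), "queries for", len(pts), "points")
```

On smooth, clusterable data most cells are label-homogeneous, so the procedure queries far fewer labels than the sample size — the intuition behind the label complexity savings the abstract claims.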

Cite this Paper


BibTeX
@InProceedings{pmlr-v30-Urner13,
  title = {PLAL: Cluster-based active learning},
  author = {Ruth Urner and Sharon Wulff and Shai Ben-David},
  booktitle = {Proceedings of the 26th Annual Conference on Learning Theory},
  pages = {376--397},
  year = {2013},
  editor = {Shai Shalev-Shwartz and Ingo Steinwart},
  volume = {30},
  series = {Proceedings of Machine Learning Research},
  address = {Princeton, NJ, USA},
  month = {12--14 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v30/Urner13.pdf},
  url = {http://proceedings.mlr.press/v30/Urner13.html},
  abstract = {We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.}
}
Endnote
%0 Conference Paper
%T PLAL: Cluster-based active learning
%A Ruth Urner
%A Sharon Wulff
%A Shai Ben-David
%B Proceedings of the 26th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2013
%E Shai Shalev-Shwartz
%E Ingo Steinwart
%F pmlr-v30-Urner13
%I PMLR
%J Proceedings of Machine Learning Research
%P 376--397
%U http://proceedings.mlr.press
%V 30
%W PMLR
%X We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.
RIS
TY - CPAPER
TI - PLAL: Cluster-based active learning
AU - Ruth Urner
AU - Sharon Wulff
AU - Shai Ben-David
BT - Proceedings of the 26th Annual Conference on Learning Theory
PY - 2013/06/13
DA - 2013/06/13
ED - Shai Shalev-Shwartz
ED - Ingo Steinwart
ID - pmlr-v30-Urner13
PB - PMLR
SP - 376
DP - PMLR
EP - 397
L1 - http://proceedings.mlr.press/v30/Urner13.pdf
UR - http://proceedings.mlr.press/v30/Urner13.html
AB - We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.
ER -
APA
Urner, R., Wulff, S. & Ben-David, S. (2013). PLAL: Cluster-based active learning. Proceedings of the 26th Annual Conference on Learning Theory, in PMLR 30:376-397.
