Nuances in Margin Conditions Determine Gains in Active Learning

Samory Kpotufe, Gan Yuan, Yunfan Zhao
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:8112-8126, 2022.

Abstract

We consider nonparametric classification with smooth regression functions, where it is well known that notions of margin in E[Y|X] determine fast or slow rates in both active and passive learning. Here we elucidate a striking distinction between the two settings. Namely, we show that some seemingly benign nuances in notions of margin - involving the uniqueness of the Bayes classifier, and which have no apparent effect on rates in passive learning - determine whether or not any active learner can outperform passive learning rates. In particular, for Audibert-Tsybakov’s margin condition (allowing general situations with non-unique Bayes classifiers), no active learner can gain over passive learning in commonly studied settings where the marginal on X is near uniform. Our results thus negate the usual intuition from past literature that active rates should improve over passive rates in nonparametric settings.

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-kpotufe22a,
  title = {Nuances in Margin Conditions Determine Gains in Active Learning},
  author = {Kpotufe, Samory and Yuan, Gan and Zhao, Yunfan},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages = {8112--8126},
  year = {2022},
  editor = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume = {151},
  series = {Proceedings of Machine Learning Research},
  month = {28--30 Mar},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v151/kpotufe22a/kpotufe22a.pdf},
  url = {https://proceedings.mlr.press/v151/kpotufe22a.html},
  abstract = {We consider nonparametric classification with smooth regression functions, where it is well known that notions of margin in E[Y|X] determine fast or slow rates in both active and passive learning. Here we elucidate a striking distinction between the two settings. Namely, we show that some seemingly benign nuances in notions of margin - involving the uniqueness of the Bayes classifier, and which have no apparent effect on rates in passive learning - determine whether or not any active learner can outperform passive learning rates. In particular, for Audibert-Tsybakov's margin condition (allowing general situations with non-unique Bayes classifiers), no active learner can gain over passive learning in commonly studied settings where the marginal on X is near uniform. Our results thus negate the usual intuition from past literature that active rates should improve over passive rates in nonparametric settings.}
}
Endnote
%0 Conference Paper
%T Nuances in Margin Conditions Determine Gains in Active Learning
%A Samory Kpotufe
%A Gan Yuan
%A Yunfan Zhao
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-kpotufe22a
%I PMLR
%P 8112--8126
%U https://proceedings.mlr.press/v151/kpotufe22a.html
%V 151
%X We consider nonparametric classification with smooth regression functions, where it is well known that notions of margin in E[Y|X] determine fast or slow rates in both active and passive learning. Here we elucidate a striking distinction between the two settings. Namely, we show that some seemingly benign nuances in notions of margin - involving the uniqueness of the Bayes classifier, and which have no apparent effect on rates in passive learning - determine whether or not any active learner can outperform passive learning rates. In particular, for Audibert-Tsybakov's margin condition (allowing general situations with non-unique Bayes classifiers), no active learner can gain over passive learning in commonly studied settings where the marginal on X is near uniform. Our results thus negate the usual intuition from past literature that active rates should improve over passive rates in nonparametric settings.
APA
Kpotufe, S., Yuan, G. &amp; Zhao, Y. (2022). Nuances in Margin Conditions Determine Gains in Active Learning. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:8112-8126. Available from https://proceedings.mlr.press/v151/kpotufe22a.html.