Adaptivity to Noise Parameters in Nonparametric Active Learning

Andrea Locatelli, Alexandra Carpentier, Samory Kpotufe
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:1383-1416, 2017.

Abstract

This work addresses various open questions in the theory of active learning for nonparametric classification. Our contributions are both statistical and algorithmic:

- We establish new minimax rates for active learning under common noise conditions. These rates display interesting transitions, due to the interaction between noise smoothness and margin, that are not present in the passive setting. Some such transitions were previously conjectured, but remained unconfirmed.
- We present a generic algorithmic strategy for adaptivity to unknown noise smoothness and margin; our strategy achieves optimal rates in many general situations; furthermore, unlike in previous work, we avoid the need for adaptive confidence sets, resulting in strictly milder distributional requirements.
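For readers unfamiliar with the terminology in the abstract, the following LaTeX sketch states the two conditions usually meant by "noise smoothness" and "margin" in nonparametric classification: Hölder smoothness of the regression function and the Tsybakov margin (low-noise) condition. This is a standard formulation for illustration only; the paper's exact parameterization and assumptions may differ.

% Regression function: \eta(x) = \mathbb{P}(Y = 1 \mid X = x).
%
% \alpha-Hölder smoothness of \eta (for some constant \lambda > 0):
\[
  |\eta(x) - \eta(x')| \le \lambda \, \|x - x'\|^{\alpha}, \qquad 0 < \alpha \le 1 .
\]
% Tsybakov margin condition with exponent \beta \ge 0 (for some constant C > 0):
\[
  \mathbb{P}_X\!\left( 0 < \bigl|\eta(X) - \tfrac{1}{2}\bigr| \le t \right) \le C\, t^{\beta},
  \qquad \text{for all } t > 0 .
\]

Larger \alpha means a smoother regression function, and larger \beta means the label noise concentrates less mass near the decision boundary; minimax rates in this literature are typically expressed in terms of both parameters and the dimension.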

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-locatelli-andrea17a,
  title = {Adaptivity to Noise Parameters in Nonparametric Active Learning},
  author = {Locatelli, Andrea and Carpentier, Alexandra and Kpotufe, Samory},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages = {1383--1416},
  year = {2017},
  editor = {Kale, Satyen and Shamir, Ohad},
  volume = {65},
  series = {Proceedings of Machine Learning Research},
  month = {07--10 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v65/locatelli-andrea17a/locatelli-andrea17a.pdf},
  url = {https://proceedings.mlr.press/v65/locatelli-andrea17a.html},
  abstract = {This work addresses various open questions in the theory of active learning for nonparametric classification. Our contributions are both statistical and algorithmic: (i) We establish new minimax rates for active learning under common noise conditions. These rates display interesting transitions, due to the interaction between noise smoothness and margin, that are not present in the passive setting. Some such transitions were previously conjectured, but remained unconfirmed. (ii) We present a generic algorithmic strategy for adaptivity to unknown noise smoothness and margin; our strategy achieves optimal rates in many general situations; furthermore, unlike in previous work, we avoid the need for adaptive confidence sets, resulting in strictly milder distributional requirements.}
}
Endnote
%0 Conference Paper
%T Adaptivity to Noise Parameters in Nonparametric Active Learning
%A Andrea Locatelli
%A Alexandra Carpentier
%A Samory Kpotufe
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-locatelli-andrea17a
%I PMLR
%P 1383--1416
%U https://proceedings.mlr.press/v65/locatelli-andrea17a.html
%V 65
%X This work addresses various open questions in the theory of active learning for nonparametric classification. Our contributions are both statistical and algorithmic: (i) We establish new minimax rates for active learning under common noise conditions. These rates display interesting transitions, due to the interaction between noise smoothness and margin, that are not present in the passive setting. Some such transitions were previously conjectured, but remained unconfirmed. (ii) We present a generic algorithmic strategy for adaptivity to unknown noise smoothness and margin; our strategy achieves optimal rates in many general situations; furthermore, unlike in previous work, we avoid the need for adaptive confidence sets, resulting in strictly milder distributional requirements.
APA
Locatelli, A., Carpentier, A. & Kpotufe, S. (2017). Adaptivity to Noise Parameters in Nonparametric Active Learning. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:1383-1416. Available from https://proceedings.mlr.press/v65/locatelli-andrea17a.html.

Related Material