A Bayes consistent 1-NN classifier

Aryeh Kontorovich, Roi Weiss
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:480-488, 2015.

Abstract

We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with sample size. We will argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier. These include user-friendly finite-sample error bounds, as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.
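The abstract's idea of a margin-regularized 1-NN can be illustrated with a small sketch. The code below is a hypothetical toy, not the authors' algorithm: it keeps only training points that are at least a margin `gamma` away from every opposite-label point (a crude stand-in for the paper's principled compression scheme), then classifies by the single nearest retained prototype.

```python
from math import dist  # Euclidean distance between two point tuples (Python 3.8+)

def margin_prune(sample, gamma):
    """Illustrative margin filter (hypothetical, not the paper's exact
    compression scheme): drop any training point lying within distance
    gamma of an opposite-label point, so the retained prototypes are
    gamma-separated from the other class."""
    kept = []
    for p, label in sample:
        if all(dist(p, q) >= gamma for q, m in sample if m != label):
            kept.append((p, label))
    return kept

def predict_1nn(sample, x):
    """Plain 1-NN rule: label x by its nearest neighbor in
    sample = [(point, label), ...]."""
    point, label = min(sample, key=lambda pl: dist(pl[0], x))
    return label

# Two well-separated clusters plus one point near the boundary.
train = [((0.0, 0.0), 0), ((1.0, 0.0), 0), ((2.9, 0.0), 0),
         ((5.0, 0.0), 1), ((6.0, 0.0), 1)]
pruned = margin_prune(train, gamma=3.0)  # boundary points are discarded
print(predict_1nn(pruned, (0.5, 0.0)))   # a point deep in class 0
print(predict_1nn(pruned, (5.5, 0.0)))   # a point deep in class 1
```

Pruning shrinks the prototype set, which is one source of the memory- and time-efficiency the abstract mentions; the paper's actual method chooses the margin in a data-dependent way to obtain Bayes consistency.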

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-kontorovich15,
  title     = {{A Bayes consistent 1-NN classifier}},
  author    = {Kontorovich, Aryeh and Weiss, Roi},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {480--488},
  year      = {2015},
  editor    = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/kontorovich15.pdf},
  url       = {https://proceedings.mlr.press/v38/kontorovich15.html},
  abstract  = {We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with sample size. We will argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier. These include user-friendly finite-sample error bounds, as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.}
}
Endnote
%0 Conference Paper
%T A Bayes consistent 1-NN classifier
%A Aryeh Kontorovich
%A Roi Weiss
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-kontorovich15
%I PMLR
%P 480--488
%U https://proceedings.mlr.press/v38/kontorovich15.html
%V 38
%X We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with sample size. We will argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier. These include user-friendly finite-sample error bounds, as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.
RIS
TY  - CPAPER
TI  - A Bayes consistent 1-NN classifier
AU  - Aryeh Kontorovich
AU  - Roi Weiss
BT  - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA  - 2015/02/21
ED  - Guy Lebanon
ED  - S. V. N. Vishwanathan
ID  - pmlr-v38-kontorovich15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 38
SP  - 480
EP  - 488
L1  - http://proceedings.mlr.press/v38/kontorovich15.pdf
UR  - https://proceedings.mlr.press/v38/kontorovich15.html
AB  - We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with sample size. We will argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier. These include user-friendly finite-sample error bounds, as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.
ER  -
APA
Kontorovich, A. & Weiss, R. (2015). A Bayes consistent 1-NN classifier. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:480-488. Available from https://proceedings.mlr.press/v38/kontorovich15.html.