Consistency of Robust Kernel Density Estimators

Robert Vandermeulen, Clayton Scott
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:568-591, 2013.

Abstract

The kernel density estimator (KDE) based on a radial positive-semidefinite kernel may be viewed as a sample mean in a reproducing kernel Hilbert space. This mean can be viewed as the solution of a least squares problem in that space. Replacing the squared loss with a robust loss yields a robust kernel density estimator (RKDE). Previous work has shown that RKDEs are weighted kernel density estimators which have desirable robustness properties. In this paper we establish asymptotic L^1 consistency of the RKDE for a class of losses and show that the RKDE converges with the same rate on bandwidth required for the traditional KDE. We also present a novel proof of the consistency of the traditional KDE.
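The abstract notes that prior work shows an RKDE is a weighted kernel density estimator. One standard way to compute those weights (not spelled out on this page, so the details below are an illustrative sketch rather than the paper's exact procedure) is kernelized iteratively re-weighted least squares: starting from uniform weights (the ordinary KDE, i.e. the sample mean in the RKHS), repeatedly downweight points whose RKHS distance to the current estimate is large under a robust loss. The sketch below assumes a one-dimensional Gaussian kernel and a Huber-type loss; all names and parameter defaults are the author's illustration.

```python
import numpy as np

def gaussian_kernel(X, Y, h):
    # Gaussian smoothing kernel k_h(x, y); here it doubles as the RKHS kernel.
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2 * h**2)) / (np.sqrt(2 * np.pi) * h)

def rkde_weights(X, h=0.5, c=0.5, n_iter=100, tol=1e-9):
    """Weights of a robust KDE via kernelized IRWLS with a Huber-type loss.

    Illustrative sketch: the estimate is f = sum_i w_i * k_h(., x_i), and each
    iteration reweights point i by psi(r_i) / r_i, where r_i is the RKHS
    distance from Phi(x_i) to the current f and psi(r) = min(r, c).
    """
    n = len(X)
    K = gaussian_kernel(X, X, h)
    w = np.full(n, 1.0 / n)  # uniform weights = the ordinary KDE
    for _ in range(n_iter):
        # Squared RKHS distance ||Phi(x_i) - f||^2 via the Gram matrix:
        # K_ii - 2 (K w)_i + w^T K w
        r2 = np.diag(K) - 2.0 * (K @ w) + w @ K @ w
        r = np.sqrt(np.maximum(r2, 1e-12))
        w_new = np.minimum(r, c) / r  # Huber psi(r)/r; clips large residuals
        w_new /= w_new.sum()          # keep the estimate a density (weights sum to 1)
        if np.max(np.abs(w_new - w)) < tol:
            break
        w = w_new
    return w

def rkde(x_grid, X, w, h=0.5):
    # Evaluate the weighted KDE on a grid of query points.
    return gaussian_kernel(x_grid, X, h) @ w
```

On contaminated samples, points far from the bulk of the data receive small weights, which is the robustness property the abstract refers to; with the squared loss (no clipping) the weights stay uniform and the RKDE reduces to the traditional KDE.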

Cite this Paper


BibTeX
@InProceedings{pmlr-v30-Vandermeulen13,
  title     = {Consistency of Robust Kernel Density Estimators},
  author    = {Vandermeulen, Robert and Scott, Clayton},
  booktitle = {Proceedings of the 26th Annual Conference on Learning Theory},
  pages     = {568--591},
  year      = {2013},
  editor    = {Shalev-Shwartz, Shai and Steinwart, Ingo},
  volume    = {30},
  series    = {Proceedings of Machine Learning Research},
  address   = {Princeton, NJ, USA},
  month     = {12--14 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v30/Vandermeulen13.pdf},
  url       = {https://proceedings.mlr.press/v30/Vandermeulen13.html},
  abstract  = {The kernel density estimator (KDE) based on a radial positive-semidefinite kernel may be viewed as a sample mean in a reproducing kernel Hilbert space. This mean can be viewed as the solution of a least squares problem in that space. Replacing the squared loss with a robust loss yields a robust kernel density estimator (RKDE). Previous work has shown that RKDEs are weighted kernel density estimators which have desirable robustness properties. In this paper we establish asymptotic L^1 consistency of the RKDE for a class of losses and show that the RKDE converges with the same rate on bandwidth required for the traditional KDE. We also present a novel proof of the consistency of the traditional KDE.}
}
Endnote
%0 Conference Paper
%T Consistency of Robust Kernel Density Estimators
%A Robert Vandermeulen
%A Clayton Scott
%B Proceedings of the 26th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2013
%E Shai Shalev-Shwartz
%E Ingo Steinwart
%F pmlr-v30-Vandermeulen13
%I PMLR
%P 568--591
%U https://proceedings.mlr.press/v30/Vandermeulen13.html
%V 30
%X The kernel density estimator (KDE) based on a radial positive-semidefinite kernel may be viewed as a sample mean in a reproducing kernel Hilbert space. This mean can be viewed as the solution of a least squares problem in that space. Replacing the squared loss with a robust loss yields a robust kernel density estimator (RKDE). Previous work has shown that RKDEs are weighted kernel density estimators which have desirable robustness properties. In this paper we establish asymptotic L^1 consistency of the RKDE for a class of losses and show that the RKDE converges with the same rate on bandwidth required for the traditional KDE. We also present a novel proof of the consistency of the traditional KDE.
RIS
TY  - CPAPER
TI  - Consistency of Robust Kernel Density Estimators
AU  - Robert Vandermeulen
AU  - Clayton Scott
BT  - Proceedings of the 26th Annual Conference on Learning Theory
DA  - 2013/06/13
ED  - Shai Shalev-Shwartz
ED  - Ingo Steinwart
ID  - pmlr-v30-Vandermeulen13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 30
SP  - 568
EP  - 591
L1  - http://proceedings.mlr.press/v30/Vandermeulen13.pdf
UR  - https://proceedings.mlr.press/v30/Vandermeulen13.html
AB  - The kernel density estimator (KDE) based on a radial positive-semidefinite kernel may be viewed as a sample mean in a reproducing kernel Hilbert space. This mean can be viewed as the solution of a least squares problem in that space. Replacing the squared loss with a robust loss yields a robust kernel density estimator (RKDE). Previous work has shown that RKDEs are weighted kernel density estimators which have desirable robustness properties. In this paper we establish asymptotic L^1 consistency of the RKDE for a class of losses and show that the RKDE converges with the same rate on bandwidth required for the traditional KDE. We also present a novel proof of the consistency of the traditional KDE.
ER  -
APA
Vandermeulen, R. & Scott, C. (2013). Consistency of Robust Kernel Density Estimators. Proceedings of the 26th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 30:568-591. Available from https://proceedings.mlr.press/v30/Vandermeulen13.html.