Uniform Convergence Rates for Kernel Density Estimation

Heinrich Jiang
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1694-1703, 2017.

Abstract

Kernel density estimation (KDE) is a popular nonparametric density estimation method. We (1) derive finite-sample high-probability density estimation bounds for multivariate KDE under mild density assumptions which hold uniformly in $x \in \mathbb{R}^d$ and bandwidth matrices. We apply these results to (2) mode, (3) density level set, and (4) class probability estimation and attain optimal rates up to logarithmic factors. We then (5) provide an extension of our results under the manifold hypothesis. Finally, we (6) give uniform convergence results for local intrinsic dimension estimation.
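As a concrete illustration of the estimator the abstract refers to, the sketch below computes a multivariate KDE with a Gaussian kernel and a bandwidth matrix $H$, i.e. $\hat f(x) = \frac{1}{n}\sum_{i=1}^n |H|^{-1/2} K\!\big(H^{-1/2}(x - X_i)\big)$ with $K$ the standard $d$-dimensional Gaussian density. This is a generic hedged example, not code from the paper; the function name `kde` and the choice $H = h^2 I$ are illustrative assumptions.

```python
import numpy as np

def kde(x, samples, H):
    """Multivariate Gaussian-kernel KDE at a single query point x.

    f_hat(x) = (1/n) * sum_i |H|^{-1/2} K(H^{-1/2} (x - X_i)),
    where K is the standard d-dimensional Gaussian density.
    (Illustrative sketch, not the paper's code.)
    """
    n, d = samples.shape
    H_inv = np.linalg.inv(H)
    det_H = np.linalg.det(H)
    diffs = samples - x                        # (n, d) differences x - X_i
    # quadratic form (x - X_i)^T H^{-1} (x - X_i) for every sample at once
    quad = np.einsum('ij,jk,ik->i', diffs, H_inv, diffs)
    K = np.exp(-0.5 * quad) / ((2 * np.pi) ** (d / 2))
    return K.sum() / (n * np.sqrt(det_H))

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 2))             # draws from N(0, I_2)
H = 0.2 ** 2 * np.eye(2)                       # isotropic bandwidth H = h^2 I
est = kde(np.zeros(2), X, H)
true = 1 / (2 * np.pi)                         # N(0, I_2) density at the origin
```

With $n = 5000$ samples and $h = 0.2$, the estimate at the origin lands close to the true density $1/(2\pi) \approx 0.159$; the paper's results quantify exactly this kind of deviation, uniformly over $x$ and over bandwidth matrices.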

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-jiang17b,
  title     = {Uniform Convergence Rates for Kernel Density Estimation},
  author    = {Heinrich Jiang},
  pages     = {1694--1703},
  year      = {2017},
  editor    = {Doina Precup and Yee Whye Teh},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  address   = {International Convention Centre, Sydney, Australia},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/jiang17b/jiang17b.pdf},
  url       = {http://proceedings.mlr.press/v70/jiang17b.html},
  abstract  = {Kernel density estimation (KDE) is a popular nonparametric density estimation method. We (1) derive finite-sample high-probability density estimation bounds for multivariate KDE under mild density assumptions which hold uniformly in $x \in \mathbb{R}^d$ and bandwidth matrices. We apply these results to (2) mode, (3) density level set, and (4) class probability estimation and attain optimal rates up to logarithmic factors. We then (5) provide an extension of our results under the manifold hypothesis. Finally, we (6) give uniform convergence results for local intrinsic dimension estimation.}
}
Endnote
%0 Conference Paper
%T Uniform Convergence Rates for Kernel Density Estimation
%A Heinrich Jiang
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-jiang17b
%I PMLR
%J Proceedings of Machine Learning Research
%P 1694--1703
%U http://proceedings.mlr.press
%V 70
%W PMLR
%X Kernel density estimation (KDE) is a popular nonparametric density estimation method. We (1) derive finite-sample high-probability density estimation bounds for multivariate KDE under mild density assumptions which hold uniformly in $x \in \mathbb{R}^d$ and bandwidth matrices. We apply these results to (2) mode, (3) density level set, and (4) class probability estimation and attain optimal rates up to logarithmic factors. We then (5) provide an extension of our results under the manifold hypothesis. Finally, we (6) give uniform convergence results for local intrinsic dimension estimation.
APA
Jiang, H. (2017). Uniform Convergence Rates for Kernel Density Estimation. Proceedings of the 34th International Conference on Machine Learning, in PMLR 70:1694-1703.