Think Global, Adapt Local: Learning Locally Adaptive K-Nearest Neighbor Kernel Density Estimators

Kenny Olsen, Rasmus M. Hoeegh Lindrup, Morten Mørup
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4114-4122, 2024.

Abstract

Kernel density estimation (KDE) is a powerful technique for non-parametric density estimation, yet practical use of KDE-based methods remains limited by insufficient representational flexibility, especially for higher-dimensional data. Contrary to KDE, K-nearest neighbor (KNN) density estimation procedures locally adapt the density based on the K-nearest neighborhood, but unfortunately only provide asymptotically correct density estimates. We present the KNN-KDE method introducing observation-specific kernels for KDE that are locally adapted through priors defined by the covariance of the K-nearest neighborhood, forming a fully Bayesian model with exact density estimates. We further derive a scalable inference procedure that infers parameters through variational inference by optimizing the predictive likelihood, exploiting sparsity, batched optimization, and parallel computation for massive inference speedups. We find that KNN-KDE provides valid density estimates superior to conventional KDE and KNN density estimation on both synthetic and real data sets. We further observe that the Bayesian KNN-KDE even outperforms recent neural density estimation procedures on two of the five considered real data sets. The KNN-KDE unifies conventional kernel and KNN density estimation, providing a scalable, generic, and accurate framework for density estimation.
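To make the core idea concrete, the following is a minimal Python sketch, not the authors' implementation: it keeps only the notion of observation-specific Gaussian kernels whose covariances are estimated from each point's K-nearest neighborhood, and replaces the paper's Bayesian priors and variational inference with a simple regularized plug-in estimate. The function name knn_kde_sketch and the regularization parameter reg are hypothetical, introduced here for illustration.

import numpy as np
from scipy.stats import multivariate_normal

def knn_kde_sketch(train, query, K=10, reg=1e-3):
    # Locally adaptive KDE: one Gaussian kernel per training point,
    # with covariance taken from that point's K-nearest neighborhood.
    n, d = train.shape
    # Pairwise distances between training points.
    dists = np.linalg.norm(train[:, None, :] - train[None, :, :], axis=-1)
    # Indices of each point's K nearest neighbors (column 0 is the point itself).
    knn_idx = np.argsort(dists, axis=1)[:, 1:K + 1]
    dens = np.zeros(len(query))
    for i in range(n):
        nbrs = train[knn_idx[i]]
        # Empirical neighborhood covariance, regularized to stay positive definite.
        cov_i = np.cov(nbrs, rowvar=False) + reg * np.eye(d)
        dens += multivariate_normal(mean=train[i], cov=cov_i).pdf(query)
    # Equal-weight mixture over the n observation-specific kernels.
    return dens / n

# Toy usage on 2D Gaussian data:
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
print(knn_kde_sketch(X, X[:5], K=15))

In the paper, the neighborhood covariances instead define priors on the kernels and the resulting posterior is inferred variationally by optimizing the predictive likelihood; the sketch above only shows why such kernels adapt to local geometry where a single global bandwidth cannot.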

Cite this Paper

BibTeX
@InProceedings{pmlr-v238-olsen24a,
  title     = {Think Global, Adapt Local: Learning Locally Adaptive {K}-Nearest Neighbor Kernel Density Estimators},
  author    = {Olsen, Kenny and M. Hoeegh Lindrup, Rasmus and M\o{}rup, Morten},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4114--4122},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/olsen24a/olsen24a.pdf},
  url       = {https://proceedings.mlr.press/v238/olsen24a.html},
  abstract  = {Kernel density estimation (KDE) is a powerful technique for non-parametric density estimation, yet practical use of KDE-based methods remains limited by insufficient representational flexibility, especially for higher-dimensional data. Contrary to KDE, K-nearest neighbor (KNN) density estimation procedures locally adapt the density based on the K-nearest neighborhood, but unfortunately only provide asymptotically correct density estimates. We present the KNN-KDE method introducing observation-specific kernels for KDE that are locally adapted through priors defined by the covariance of the K-nearest neighborhood, forming a fully Bayesian model with exact density estimates. We further derive a scalable inference procedure that infers parameters through variational inference by optimizing the predictive likelihood, exploiting sparsity, batched optimization, and parallel computation for massive inference speedups. We find that KNN-KDE provides valid density estimates superior to conventional KDE and KNN density estimation on both synthetic and real data sets. We further observe that the Bayesian KNN-KDE even outperforms recent neural density estimation procedures on two of the five considered real data sets. The KNN-KDE unifies conventional kernel and KNN density estimation, providing a scalable, generic, and accurate framework for density estimation.}
}
Endnote
%0 Conference Paper
%T Think Global, Adapt Local: Learning Locally Adaptive K-Nearest Neighbor Kernel Density Estimators
%A Kenny Olsen
%A Rasmus M. Hoeegh Lindrup
%A Morten Mørup
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-olsen24a
%I PMLR
%P 4114--4122
%U https://proceedings.mlr.press/v238/olsen24a.html
%V 238
%X Kernel density estimation (KDE) is a powerful technique for non-parametric density estimation, yet practical use of KDE-based methods remains limited by insufficient representational flexibility, especially for higher-dimensional data. Contrary to KDE, K-nearest neighbor (KNN) density estimation procedures locally adapt the density based on the K-nearest neighborhood, but unfortunately only provide asymptotically correct density estimates. We present the KNN-KDE method introducing observation-specific kernels for KDE that are locally adapted through priors defined by the covariance of the K-nearest neighborhood, forming a fully Bayesian model with exact density estimates. We further derive a scalable inference procedure that infers parameters through variational inference by optimizing the predictive likelihood, exploiting sparsity, batched optimization, and parallel computation for massive inference speedups. We find that KNN-KDE provides valid density estimates superior to conventional KDE and KNN density estimation on both synthetic and real data sets. We further observe that the Bayesian KNN-KDE even outperforms recent neural density estimation procedures on two of the five considered real data sets. The KNN-KDE unifies conventional kernel and KNN density estimation, providing a scalable, generic, and accurate framework for density estimation.
APA
Olsen, K., M. Hoeegh Lindrup, R., & Mørup, M. (2024). Think Global, Adapt Local: Learning Locally Adaptive K-Nearest Neighbor Kernel Density Estimators. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4114-4122. Available from https://proceedings.mlr.press/v238/olsen24a.html.
