Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:669-677, 2014.
Abstract
Asymptotically unbiased nearest-neighbor estimators for K-L divergence have recently been proposed and demonstrated in a number of applications. With small sample sizes, however, these nonparametric methods typically suffer from high estimation bias due to the non-local statistics of empirical nearest-neighbor information. In this paper, we show that this non-local bias can be mitigated by changing the distance metric, and we propose a method for learning an optimal Mahalanobis-type metric based on global information provided by approximate parametric models of the underlying densities. In both simulations and experiments, we demonstrate that this interplay between parametric models and nonparametric estimation methods significantly improves the accuracy of the nearest-neighbor K-L divergence estimator.
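For concreteness, the sketch below shows a standard k-nearest-neighbor K-L divergence estimator and how a Mahalanobis-type metric enters it. This is an illustrative sketch, not the paper's bias-reduction or metric-learning procedure: the function name knn_kl_divergence, the use of SciPy's cKDTree, the choice of k, and the hand-supplied metric matrix M are all assumptions; in the paper's setting the metric would instead be learned from approximate parametric models of the densities.

```python
# Minimal sketch of a k-NN Kullback-Leibler divergence estimator with an
# optional Mahalanobis metric. Illustrative only; not the paper's method.
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(X, Y, M=None, k=1):
    """Estimate D(p || q) from samples X ~ p and Y ~ q.

    If M is given, distances are Mahalanobis: sqrt((x - y)^T M (x - y)),
    implemented by mapping the data through L with M = L^T L and then
    using ordinary Euclidean distances.
    """
    X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)
    n, d = X.shape
    m = Y.shape[0]
    if M is not None:
        L = np.linalg.cholesky(M).T          # M = L^T L
        X, Y = X @ L.T, Y @ L.T              # Euclidean dist. = Mahalanobis dist.
    tree_X, tree_Y = cKDTree(X), cKDTree(Y)
    # Distance to the k-th neighbor within X, skipping the query point itself.
    rho = tree_X.query(X, k=k + 1)[0][:, -1]
    # Distance to the k-th neighbor within Y.
    nu = tree_Y.query(X, k=k)[0]
    nu = nu[:, -1] if nu.ndim > 1 else nu
    # Standard k-NN divergence estimate.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Example usage (hypothetical data): with an identity metric this reduces to
# the usual Euclidean k-NN estimator.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(200, 2))   # samples from p
    Y = rng.normal(0.5, 1.0, size=(200, 2))   # samples from q
    print(knn_kl_divergence(X, Y, M=np.eye(2), k=1))
```

The Mahalanobis matrix M only rescales and rotates the space, so it leaves the asymptotic limit of the estimate unchanged; its effect is on the finite-sample behavior, which is what the paper exploits when choosing the metric.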