Nonparametric Estimation of Conditional Information and Divergences
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:914-923, 2012.
In this paper we propose new nonparametric estimators for a family of conditional mutual information and divergence measures. Our estimators are easy to compute: they rely only on simple k-nearest-neighbor statistics. We prove that the proposed conditional information and divergence estimators are consistent under certain conditions, and demonstrate their consistency and applicability through numerical experiments on both simulated and real data.
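To illustrate the flavor of k-nearest-neighbor divergence estimation referred to in the abstract, the sketch below implements a standard kNN-based estimator of the (unconditional) Kullback-Leibler divergence from samples, in the style of Wang, Kulkarni, and Verdú. This is not the paper's own conditional estimator; it is a minimal, hedged example of the kind of kNN statistic such estimators build on, with the function name and parameter choices being our own.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=5):
    """kNN-based estimate of D(p || q) from samples x ~ p and y ~ q.

    Illustrative sketch (Wang-Kulkarni-Verdu style), not the paper's
    conditional estimator. Assumes k >= 2 and continuous distributions
    (no duplicate points, so nearest-neighbor distances are positive).
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its k-th nearest neighbor within x,
    # querying k+1 neighbors to skip the trivial self-match.
    rho = cKDTree(x).query(x, k + 1)[0][:, k]
    # Distance from each x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k)[0][:, k - 1]
    # Plug-in estimate: d * mean log-ratio of kNN radii plus a
    # sample-size correction term.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

For two univariate Gaussians N(0,1) and N(1,1), the true divergence is 0.5, and the estimate approaches it as the sample size grows; consistency results of this type are what the paper establishes for the conditional case.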