Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation

Hiroaki Sasaki, Yung-Kyun Noh, Masashi Sugiyama
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:809-818, 2015.

Abstract

Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily mean a good density-derivative estimator. In this paper, we give a direct method to approximate the density derivative without estimating the density itself. Our proposed estimator allows analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, with the ability that all hyper-parameters can be chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful in improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.
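The abstract's core idea is to fit the density derivative directly, rather than differentiating a separately estimated density. As a rough illustration of how such a direct fit can work, the sketch below fits a linear-in-parameters model g(x) = Σ_l θ_l φ_l(x) to p'(x) by least squares, using integration by parts (∫ g p' dx = -∫ g' p dx ≈ -(1/n) Σ_i g'(x_i)) to eliminate the unknown p'. This is a minimal one-dimensional sketch, not the paper's exact procedure: the Gaussian basis, fixed bandwidth, and ridge penalty here are illustrative assumptions, whereas the paper chooses all hyper-parameters objectively by cross-validation.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)        # samples from p = N(0, 1)

c = x[:100]                          # Gaussian basis centers (a subset of samples)
sigma, lam = 0.5, 1e-3               # bandwidth and ridge penalty (illustrative choices)

def phi(t):
    # Basis functions evaluated at points t: shape (len(t), len(c)).
    return np.exp(-(t[:, None] - c[None, :]) ** 2 / (2 * sigma ** 2))

def dphi(t):
    # Derivatives of the basis functions with respect to t.
    return -(t[:, None] - c[None, :]) / sigma ** 2 * phi(t)

# G_lm = ∫ phi_l(x) phi_m(x) dx has a closed form for Gaussian bases.
G = np.sqrt(np.pi) * sigma * np.exp(-(c[:, None] - c[None, :]) ** 2 / (4 * sigma ** 2))

# Integration by parts turns the cross term ∫ g p' dx into -∫ g' p dx,
# which is approximated by the sample average of g'; only b_l below is needed.
b = dphi(x).mean(axis=0)

# Minimize θᵀGθ + 2θᵀb + λθᵀθ analytically.
theta = np.linalg.solve(G + lam * np.eye(len(c)), -b)

def dp_hat(t):
    # Direct estimate of the density derivative p'(t).
    return phi(np.atleast_1d(t)) @ theta
```

For the standard normal, p'(x) = -x N(x; 0, 1), so the estimate should be negative at x = 1, positive at x = -1, and near zero at the origin; no intermediate density estimate is ever formed.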

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-sasaki15,
  title     = {{Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation}},
  author    = {Sasaki, Hiroaki and Noh, Yung-Kyun and Sugiyama, Masashi},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {809--818},
  year      = {2015},
  editor    = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/sasaki15.pdf},
  url       = {https://proceedings.mlr.press/v38/sasaki15.html},
  abstract  = {Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily mean a good density-derivative estimator. In this paper, we give a direct method to approximate the density derivative without estimating the density itself. Our proposed estimator allows analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, with the ability that all hyper-parameters can be chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful in improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.}
}
Endnote
%0 Conference Paper
%T Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation
%A Hiroaki Sasaki
%A Yung-Kyun Noh
%A Masashi Sugiyama
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-sasaki15
%I PMLR
%P 809--818
%U https://proceedings.mlr.press/v38/sasaki15.html
%V 38
%X Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily mean a good density-derivative estimator. In this paper, we give a direct method to approximate the density derivative without estimating the density itself. Our proposed estimator allows analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, with the ability that all hyper-parameters can be chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful in improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.
RIS
TY - CPAPER
TI - Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation
AU - Hiroaki Sasaki
AU - Yung-Kyun Noh
AU - Masashi Sugiyama
BT - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA - 2015/02/21
ED - Guy Lebanon
ED - S. V. N. Vishwanathan
ID - pmlr-v38-sasaki15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 38
SP - 809
EP - 818
L1 - http://proceedings.mlr.press/v38/sasaki15.pdf
UR - https://proceedings.mlr.press/v38/sasaki15.html
AB - Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily mean a good density-derivative estimator. In this paper, we give a direct method to approximate the density derivative without estimating the density itself. Our proposed estimator allows analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, with the ability that all hyper-parameters can be chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful in improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.
ER -
APA
Sasaki, H., Noh, Y.-K., & Sugiyama, M. (2015). Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:809-818. Available from https://proceedings.mlr.press/v38/sasaki15.html.