Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation

Taiji Suzuki, Masashi Sugiyama, Jun Sese, Takafumi Kanamori
Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008, PMLR 4:5-20, 2008.

Abstract

Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
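The density-ratio idea in the abstract can be illustrated with a rough sketch (not the authors' exact formulation; all function and parameter names below are illustrative): model the ratio w(x,y) = p(x,y)/(p(x)p(y)) as a non-negative combination of Gaussian kernels, maximize the mean log-ratio over the paired samples subject to a normalization constraint over the product of marginals, and read off the MI estimate as the mean log of the fitted ratio.

```python
import numpy as np

def mlmi(x, y, n_basis=50, sigma=1.0, n_iter=200, seed=0):
    """Sketch of MLMI-style mutual information estimation (illustrative only).

    Fits w_alpha(x, y) = sum_l alpha_l * k(x, u_l) * k(y, v_l), alpha >= 0,
    by maximizing the mean log-ratio on paired samples subject to the
    normalization (1/n^2) * sum_{i,j} w_alpha(x_i, y_j) = 1, then returns
    the MI estimate (1/n) * sum_i log w_alpha(x_i, y_i).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    idx = rng.choice(n, size=min(n_basis, n), replace=False)
    u, v = x[idx], y[idx]  # kernel centers taken from the sample

    # Gaussian kernel matrices against the centers.
    Kx = np.exp(-(x[:, None] - u[None, :]) ** 2 / (2 * sigma ** 2))
    Ky = np.exp(-(y[:, None] - v[None, :]) ** 2 / (2 * sigma ** 2))
    K_pair = Kx * Ky  # evaluated on joint samples (x_i, y_i)
    # Product form makes the average over all (i, j) pairs cheap:
    # (1/n^2) sum_{i,j} k(x_i,u_l) k(y_j,v_l) = mean(Kx_l) * mean(Ky_l).
    K_marg = Kx.mean(axis=0) * Ky.mean(axis=0)

    alpha = np.ones(len(u))
    alpha /= K_marg @ alpha  # start feasible
    for _ in range(n_iter):
        w = K_pair @ alpha
        # EM-style multiplicative fixed-point update for the constrained
        # maximum likelihood problem; keeps alpha >= 0 automatically.
        alpha *= (K_pair / w[:, None]).mean(axis=0) / K_marg
        alpha /= K_marg @ alpha  # re-impose the normalization constraint
    w = K_pair @ alpha
    return float(np.mean(np.log(np.maximum(w, 1e-12))))
```

The multiplicative update arises because, after absorbing the linear normalization constraint, the objective has the same form as maximum likelihood estimation of mixture weights, for which the EM fixed point increases the likelihood monotonically. In the spirit of the paper, the kernel width `sigma` and the number of basis functions would be chosen by cross-validation.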

Cite this Paper


BibTeX
@InProceedings{pmlr-v4-suzuki08a,
  title     = {Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation},
  author    = {Taiji Suzuki and Masashi Sugiyama and Jun Sese and Takafumi Kanamori},
  pages     = {5--20},
  year      = {2008},
  editor    = {Yvan Saeys and Huan Liu and Iñaki Inza and Louis Wehenkel and Yves Van de Peer},
  volume    = {4},
  series    = {Proceedings of Machine Learning Research},
  address   = {Antwerp, Belgium},
  month     = {15 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v4/suzuki08a/suzuki08a.pdf},
  url       = {http://proceedings.mlr.press/v4/suzuki08a.html},
  abstract  = {Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.}
}
Endnote
%0 Conference Paper
%T Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation
%A Taiji Suzuki
%A Masashi Sugiyama
%A Jun Sese
%A Takafumi Kanamori
%B Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008
%C Proceedings of Machine Learning Research
%D 2008
%E Yvan Saeys
%E Huan Liu
%E Iñaki Inza
%E Louis Wehenkel
%E Yves Van de Peer
%F pmlr-v4-suzuki08a
%I PMLR
%J Proceedings of Machine Learning Research
%P 5--20
%U http://proceedings.mlr.press
%V 4
%W PMLR
%X Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
RIS
TY - CPAPER
TI - Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation
AU - Taiji Suzuki
AU - Masashi Sugiyama
AU - Jun Sese
AU - Takafumi Kanamori
BT - Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008
PY - 2008/09/11
DA - 2008/09/11
ED - Yvan Saeys
ED - Huan Liu
ED - Iñaki Inza
ED - Louis Wehenkel
ED - Yves Van de Peer
ID - pmlr-v4-suzuki08a
PB - PMLR
SP - 5
DP - PMLR
EP - 20
L1 - http://proceedings.mlr.press/v4/suzuki08a/suzuki08a.pdf
UR - http://proceedings.mlr.press/v4/suzuki08a.html
AB - Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
ER -
APA
Suzuki, T., Sugiyama, M., Sese, J. & Kanamori, T. (2008). Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation. Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008, in PMLR 4:5-20.
