Consistent and Efficient Nonparametric Different-Feature Selection

Satoshi Hara, Takayuki Katsuki, Hiroki Yanagisawa, Takafumi Ono, Ryo Okamoto, Shigeki Takeuchi
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:130-138, 2017.

Abstract

Two-sample feature selection is a ubiquitous problem in both scientific and engineering studies. We propose a feature selection method to find features that describe a difference in two probability distributions. The proposed method is nonparametric and does not assume any specific parametric models on data distributions. We show that the proposed method is computationally efficient and does not require any extra computation for model selection. Moreover, we prove that the proposed method provides a consistent estimator of features under mild conditions. Our experimental results show that the proposed method outperforms the current method with regard to both accuracy and computation time.
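To make the problem setting concrete, below is a minimal, hypothetical sketch of two-sample (different-)feature selection on toy data. It is not the estimator proposed in the paper; it simply applies an off-the-shelf nonparametric two-sample test (Kolmogorov-Smirnov, via scipy.stats.ks_2samp) to each feature independently, and the toy data, significance level, and differing feature indices are chosen purely for illustration.

    # Illustrative sketch (NOT the paper's method): a naive per-feature
    # two-sample test showing what "different-feature selection" means.
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    n, d = 500, 10

    # Two toy samples that differ only in features 0 and 3 (assumed setup).
    X = rng.normal(0.0, 1.0, size=(n, d))
    Y = rng.normal(0.0, 1.0, size=(n, d))
    Y[:, 0] += 1.0   # mean shift in feature 0
    Y[:, 3] *= 2.0   # variance change in feature 3

    # Flag a feature as "different" if a per-feature KS test rejects at level alpha.
    alpha = 0.01
    selected = [j for j in range(d) if ks_2samp(X[:, j], Y[:, j]).pvalue < alpha]
    print("features flagged as different:", selected)   # expected: [0, 3]

Per-feature tests of this kind require choosing a significance level and cannot capture differences that only appear jointly across several features; the paper instead targets a nonparametric multivariate estimator with a consistency guarantee and no extra model-selection computation.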

Cite this Paper


BibTeX
@InProceedings{pmlr-v54-hara17a,
  title     = {{Consistent and Efficient Nonparametric Different-Feature Selection}},
  author    = {Hara, Satoshi and Katsuki, Takayuki and Yanagisawa, Hiroki and Ono, Takafumi and Okamoto, Ryo and Takeuchi, Shigeki},
  booktitle = {Proceedings of the 20th International Conference on Artificial Intelligence and Statistics},
  pages     = {130--138},
  year      = {2017},
  editor    = {Singh, Aarti and Zhu, Jerry},
  volume    = {54},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v54/hara17a/hara17a.pdf},
  url       = {https://proceedings.mlr.press/v54/hara17a.html},
  abstract  = {Two-sample feature selection is a ubiquitous problem in both scientific and engineering studies. We propose a feature selection method to find features that describe a difference in two probability distributions. The proposed method is nonparametric and does not assume any specific parametric models on data distributions. We show that the proposed method is computationally efficient and does not require any extra computation for model selection. Moreover, we prove that the proposed method provides a consistent estimator of features under mild conditions. Our experimental results show that the proposed method outperforms the current method with regard to both accuracy and computation time.}
}
Endnote
%0 Conference Paper
%T Consistent and Efficient Nonparametric Different-Feature Selection
%A Satoshi Hara
%A Takayuki Katsuki
%A Hiroki Yanagisawa
%A Takafumi Ono
%A Ryo Okamoto
%A Shigeki Takeuchi
%B Proceedings of the 20th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2017
%E Aarti Singh
%E Jerry Zhu
%F pmlr-v54-hara17a
%I PMLR
%P 130--138
%U https://proceedings.mlr.press/v54/hara17a.html
%V 54
%X Two-sample feature selection is a ubiquitous problem in both scientific and engineering studies. We propose a feature selection method to find features that describe a difference in two probability distributions. The proposed method is nonparametric and does not assume any specific parametric models on data distributions. We show that the proposed method is computationally efficient and does not require any extra computation for model selection. Moreover, we prove that the proposed method provides a consistent estimator of features under mild conditions. Our experimental results show that the proposed method outperforms the current method with regard to both accuracy and computation time.
APA
Hara, S., Katsuki, T., Yanagisawa, H., Ono, T., Okamoto, R. & Takeuchi, S. (2017). Consistent and Efficient Nonparametric Different-Feature Selection. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 54:130-138. Available from https://proceedings.mlr.press/v54/hara17a.html.