A New Perspective for Information Theoretic Feature Selection

Gavin Brown
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:49-56, 2009.

Abstract

Feature filters are among the simplest and fastest approaches to feature selection. A “filter” defines a statistical criterion used to rank features by how useful they are expected to be for classification. The highest-ranking features are retained, and the lowest-ranking can be discarded. A common approach is to use the mutual information between each feature and the class label. This area has seen a recent flurry of activity, resulting in a confusing variety of heuristic criteria, all based on mutual information, and a lack of a principled way to understand or relate them. The contribution of this paper is a unifying theoretical understanding of such filters. In contrast to current methods, which manually construct filter criteria with particular properties, we show how to naturally derive a space of possible ranking criteria. We show that several recent contributions in the feature selection literature are points within this space, and that there exist many points that have never been explored.
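
As a concrete illustration of the baseline criterion the abstract describes, here is a minimal sketch in Python that ranks discrete features by their empirical mutual information with the class label and keeps the top k. The helper names (mutual_information, rank_features) and the toy data are illustrative assumptions, not code from the paper, which goes beyond this simple univariate ranking by deriving a whole space of criteria.

# Minimal sketch of a mutual-information feature filter (assumes
# discrete features and labels). Illustrative only; not the paper's
# derivation, just the baseline ranking criterion it builds on.
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information I(X;Y) between two discrete arrays, in bits."""
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)                  # empirical P(X = xv)
        for yv in np.unique(y):
            py = np.mean(y == yv)              # empirical P(Y = yv)
            pxy = np.mean((x == xv) & (y == yv))  # empirical P(X = xv, Y = yv)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def rank_features(X, y, k):
    """Score each column of X by I(X_j; y) and return the indices of the top k."""
    scores = np.array([mutual_information(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Toy usage: feature 0 determines the label exactly, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = np.column_stack([y, rng.integers(0, 2, size=200)])
print(rank_features(X, y, k=1))  # expected output: [0]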

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-brown09a,
  title     = {A New Perspective for Information Theoretic Feature Selection},
  author    = {Brown, Gavin},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {49--56},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/brown09a/brown09a.pdf},
  url       = {https://proceedings.mlr.press/v5/brown09a.html}
}
APA
Brown, G. (2009). A New Perspective for Information Theoretic Feature Selection. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:49-56. Available from https://proceedings.mlr.press/v5/brown09a.html.