A New Perspective for Information Theoretic Feature Selection

Gavin Brown
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:49-56, 2009.

Abstract

Feature Filters are among the simplest and fastest approaches to feature selection. A “filter” defines a statistical criterion, used to rank features on how useful they are expected to be for classification. The highest ranking features are retained, and the lowest ranking can be discarded. A common approach is to use the Mutual Information between the features and class label. This area has seen a recent flurry of activity, resulting in a confusing variety of heuristic criteria all based on mutual information, and a lack of a principled way to understand or relate them. The contribution of this paper is a unifying theoretical understanding of such filters. In contrast to current methods which manually construct filter criteria with particular properties, we show how to naturally derive a space of possible ranking criteria. We will show that several recent contributions in the feature selection literature are points within this space, and that there exist many points that have never been explored.
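As a concrete illustration of the filter approach described in the abstract, the sketch below ranks discrete features by their empirical mutual information with the class label. This is a minimal, self-contained example of the general MI-ranking idea, not code from the paper; the toy feature names and data are invented for illustration.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y), in bits, between two discrete variables."""
    n = len(xs)
    px = Counter(xs)            # marginal counts of X
    py = Counter(ys)            # marginal counts of Y
    pxy = Counter(zip(xs, ys))  # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), rewritten with counts
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Toy data: feature "a" mirrors the label exactly; feature "b" is only weakly related.
labels    = [0, 0, 1, 1, 0, 1, 0, 1]
feature_a = [0, 0, 1, 1, 0, 1, 0, 1]
feature_b = [0, 1, 0, 1, 0, 1, 0, 1]

scores = {name: mutual_information(f, labels)
          for name, f in [("a", feature_a), ("b", feature_b)]}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # → ['a', 'b']: the informative feature ranks first
```

A filter then simply keeps the top-k features of `ranked` and discards the rest; the heuristic criteria the paper unifies replace the plain `mutual_information` score with variants that also account for redundancy among already-selected features.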

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-brown09a,
  title     = {A New Perspective for Information Theoretic Feature Selection},
  author    = {Gavin Brown},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {49--56},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/brown09a/brown09a.pdf},
  url       = {http://proceedings.mlr.press/v5/brown09a.html},
  abstract  = {Feature Filters are among the simplest and fastest approaches to feature selection. A “filter” defines a statistical criterion, used to rank features on how useful they are expected to be for classification. The highest ranking features are retained, and the lowest ranking can be discarded. A common approach is to use the Mutual Information between the features and class label. This area has seen a recent flurry of activity, resulting in a confusing variety of heuristic criteria all based on mutual information, and a lack of a principled way to understand or relate them. The contribution of this paper is a unifying theoretical understanding of such filters. In contrast to current methods which manually construct filter criteria with particular properties, we show how to naturally derive a space of possible ranking criteria. We will show that several recent contributions in the feature selection literature are points within this space, and that there exist many points that have never been explored.}
}
Endnote
%0 Conference Paper
%T A New Perspective for Information Theoretic Feature Selection
%A Gavin Brown
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-brown09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 49--56
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X Feature Filters are among the simplest and fastest approaches to feature selection. A “filter” defines a statistical criterion, used to rank features on how useful they are expected to be for classification. The highest ranking features are retained, and the lowest ranking can be discarded. A common approach is to use the Mutual Information between the features and class label. This area has seen a recent flurry of activity, resulting in a confusing variety of heuristic criteria all based on mutual information, and a lack of a principled way to understand or relate them. The contribution of this paper is a unifying theoretical understanding of such filters. In contrast to current methods which manually construct filter criteria with particular properties, we show how to naturally derive a space of possible ranking criteria. We will show that several recent contributions in the feature selection literature are points within this space, and that there exist many points that have never been explored.
RIS
TY - CPAPER
TI - A New Perspective for Information Theoretic Feature Selection
AU - Gavin Brown
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY - 2009/04/15
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-brown09a
PB - PMLR
SP - 49
DP - PMLR
EP - 56
L1 - http://proceedings.mlr.press/v5/brown09a/brown09a.pdf
UR - http://proceedings.mlr.press/v5/brown09a.html
AB - Feature Filters are among the simplest and fastest approaches to feature selection. A “filter” defines a statistical criterion, used to rank features on how useful they are expected to be for classification. The highest ranking features are retained, and the lowest ranking can be discarded. A common approach is to use the Mutual Information between the features and class label. This area has seen a recent flurry of activity, resulting in a confusing variety of heuristic criteria all based on mutual information, and a lack of a principled way to understand or relate them. The contribution of this paper is a unifying theoretical understanding of such filters. In contrast to current methods which manually construct filter criteria with particular properties, we show how to naturally derive a space of possible ranking criteria. We will show that several recent contributions in the feature selection literature are points within this space, and that there exist many points that have never been explored.
ER -
APA
Brown, G. (2009). A New Perspective for Information Theoretic Feature Selection. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:49-56.
