Jointly Informative Feature Selection

Leonidas Lefakis, Francois Fleuret
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:567-575, 2014.

Abstract

We propose several novel criteria for the selection of groups of jointly informative continuous features in the context of classification. Our approach is based on combining Gaussian modeling of the feature responses with derived upper bounds on their mutual information with the class label and on their joint entropy. We further propose specific algorithmic implementations of these criteria which reduce the computational complexity of the algorithms by up to two orders of magnitude, making these strategies tractable in practice. Experiments on multiple computer-vision databases, and using several types of classifiers, show that this class of methods outperforms state-of-the-art baselines, both in terms of speed and classification accuracy.
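
To make the idea concrete: under a Gaussian class-conditional model, and because the Gaussian maximizes entropy among distributions with a given covariance, the mutual information between a feature subset X_S and the class label Y admits a classical upper bound of the form

    I(X_S; Y) \le \tfrac{1}{2} \Bigl( \log\det\Sigma_S - \sum_{c} p(c)\, \log\det\Sigma_{S|c} \Bigr),

where \Sigma_S is the marginal covariance of the selected features and \Sigma_{S|c} their covariance within class c (the (2\pi e)^d entropy terms cancel). The sketch below is a minimal illustration of a greedy forward selection driven by a bound of this form; it is not the paper's actual criteria or implementations, and the function names (gaussian_mi_bound, greedy_forward_selection) and the ridge regularizer are assumptions of this sketch. It assumes numpy, a dense feature matrix X of shape (n_samples, n_features), and integer class labels y.

import numpy as np

def gaussian_mi_bound(X, y, subset, eps=1e-6):
    """Upper bound on I(X_S; Y) under Gaussian class-conditional modeling.

    Computes 0.5 * (logdet(Sigma_S) - sum_c p(c) * logdet(Sigma_{S|c})).
    Illustrative sketch only, not the paper's implementation.
    """
    Xs = X[:, subset]
    ridge = eps * np.eye(len(subset))  # keeps covariances well-conditioned
    marginal_cov = np.atleast_2d(np.cov(Xs, rowvar=False)) + ridge
    _, bound = np.linalg.slogdet(marginal_cov)
    for c in np.unique(y):
        Xc = Xs[y == c]
        class_cov = np.atleast_2d(np.cov(Xc, rowvar=False)) + ridge
        _, class_logdet = np.linalg.slogdet(class_cov)
        bound -= (len(Xc) / len(X)) * class_logdet
    return 0.5 * bound

def greedy_forward_selection(X, y, k):
    """Greedily grow a feature subset, at each step adding the candidate
    feature that maximizes the mutual-information bound of the enlarged set."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        scores = [gaussian_mi_bound(X, y, selected + [j]) for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

For instance, greedy_forward_selection(X, y, k=20) returns the indices of 20 selected features. Note that this naive version recomputes every log-determinant from scratch, at cubic cost in the subset size for each candidate feature; the "specific algorithmic implementations" mentioned in the abstract target exactly this kind of redundant computation to obtain the reported speedups.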

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-lefakis14,
  title     = {{Jointly Informative Feature Selection}},
  author    = {Lefakis, Leonidas and Fleuret, Francois},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {567--575},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/lefakis14.pdf},
  url       = {https://proceedings.mlr.press/v33/lefakis14.html}
}
Endnote
%0 Conference Paper
%T Jointly Informative Feature Selection
%A Leonidas Lefakis
%A Francois Fleuret
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-lefakis14
%I PMLR
%P 567--575
%U https://proceedings.mlr.press/v33/lefakis14.html
%V 33
RIS
TY - CPAPER
TI - Jointly Informative Feature Selection
AU - Leonidas Lefakis
AU - Francois Fleuret
BT - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
DA - 2014/04/02
ED - Samuel Kaski
ED - Jukka Corander
ID - pmlr-v33-lefakis14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 33
SP - 567
EP - 575
L1 - http://proceedings.mlr.press/v33/lefakis14.pdf
UR - https://proceedings.mlr.press/v33/lefakis14.html
ER -
APA
Lefakis, L. & Fleuret, F. (2014). Jointly Informative Feature Selection. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:567-575. Available from https://proceedings.mlr.press/v33/lefakis14.html.
