Supervised Feature Selection via Ensemble Gradient Information from Sparse Neural Networks

Kaiting Liu, Zahra Atashgahi, Ghada Sokar, Mykola Pechenizkiy, Decebal Constantin Mocanu
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3952-3960, 2024.

Abstract

Feature selection algorithms aim to select a subset of informative features from a dataset to reduce the data dimensionality, thereby lowering resource consumption and improving the model's performance and interpretability. In recent years, feature selection based on neural networks has become a new trend, demonstrating superiority over traditional feature selection methods. However, most existing methods use dense neural networks to detect informative features, which incurs significant computational and memory overhead. In this paper, taking inspiration from the successful application of local sensitivity analysis to neural networks, we propose "GradEnFS", a novel resource-efficient supervised feature selection algorithm based on a sparse multi-layer perceptron. By utilizing the gradient information of various sparse models from different training iterations, our method successfully detects the informative feature subset. We performed extensive experiments on nine classification datasets spanning various domains to evaluate the effectiveness of our method. The results demonstrate that our proposed approach outperforms state-of-the-art methods in selecting informative features while substantially reducing resource consumption. Moreover, we show that using a sparse neural network for feature selection not only reduces resource consumption but also offers a significant advantage over other methods when performing feature selection on noisy datasets.
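To make the core idea concrete, below is a minimal PyTorch sketch of what the abstract describes: train a sparse multi-layer perceptron and accumulate per-feature input-gradient magnitudes (a local sensitivity measure) across training iterations, then rank features by the ensembled scores. This is an illustration under stated assumptions, not the authors' implementation: the fixed random masks stand in for whatever sparse-training scheme the paper uses, the sum of absolute input gradients is one plausible scoring rule, and the names SparseMLP and feature_scores are hypothetical.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseMLP(nn.Module):
        # MLP whose weights are masked by fixed random binary masks.
        # (Assumption: the paper may instead evolve the sparse topology
        # during training; a static mask keeps this sketch short.)
        def __init__(self, n_features, n_hidden, n_classes, density=0.2):
            super().__init__()
            self.fc1 = nn.Linear(n_features, n_hidden)
            self.fc2 = nn.Linear(n_hidden, n_classes)
            self.register_buffer("mask1", (torch.rand_like(self.fc1.weight) < density).float())
            self.register_buffer("mask2", (torch.rand_like(self.fc2.weight) < density).float())

        def forward(self, x):
            h = F.relu(F.linear(x, self.fc1.weight * self.mask1, self.fc1.bias))
            return F.linear(h, self.fc2.weight * self.mask2, self.fc2.bias)

    def feature_scores(model, loader, epochs=10, lr=0.01):
        # Train the sparse model while accumulating |dLoss/dInput| per feature
        # over all iterations; large accumulated gradients mark features the
        # changing sparse models are consistently sensitive to.
        optimizer = torch.optim.SGD(model.parameters(), lr=lr)
        scores = None
        for _ in range(epochs):
            for x, y in loader:
                x = x.clone().requires_grad_(True)
                loss = F.cross_entropy(model(x), y)
                optimizer.zero_grad()
                loss.backward()  # fills both parameter grads and x.grad
                batch_score = x.grad.abs().sum(dim=0)
                scores = batch_score if scores is None else scores + batch_score
                optimizer.step()
        return scores

    # Example usage: rank features and keep the top k.
    # scores = feature_scores(SparseMLP(n_features=100, n_hidden=64, n_classes=2), loader)
    # selected = torch.topk(scores, k=20).indices

Aggregating the scores over every iteration, rather than reading gradients off the final model only, is what makes this an ensemble over the sequence of sparse models produced during training.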

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-liu24f,
  title     = {Supervised Feature Selection via Ensemble Gradient Information from Sparse Neural Networks},
  author    = {Liu, Kaiting and Atashgahi, Zahra and Sokar, Ghada and Pechenizkiy, Mykola and Mocanu, Decebal Constantin},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3952--3960},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/liu24f/liu24f.pdf},
  url       = {https://proceedings.mlr.press/v238/liu24f.html},
  abstract  = {Feature selection algorithms aim to select a subset of informative features from a dataset to reduce the data dimensionality, thereby lowering resource consumption and improving the model's performance and interpretability. In recent years, feature selection based on neural networks has become a new trend, demonstrating superiority over traditional feature selection methods. However, most existing methods use dense neural networks to detect informative features, which incurs significant computational and memory overhead. In this paper, taking inspiration from the successful application of local sensitivity analysis to neural networks, we propose "GradEnFS", a novel resource-efficient supervised feature selection algorithm based on a sparse multi-layer perceptron. By utilizing the gradient information of various sparse models from different training iterations, our method successfully detects the informative feature subset. We performed extensive experiments on nine classification datasets spanning various domains to evaluate the effectiveness of our method. The results demonstrate that our proposed approach outperforms state-of-the-art methods in selecting informative features while substantially reducing resource consumption. Moreover, we show that using a sparse neural network for feature selection not only reduces resource consumption but also offers a significant advantage over other methods when performing feature selection on noisy datasets.}
}
Endnote
%0 Conference Paper
%T Supervised Feature Selection via Ensemble Gradient Information from Sparse Neural Networks
%A Kaiting Liu
%A Zahra Atashgahi
%A Ghada Sokar
%A Mykola Pechenizkiy
%A Decebal Constantin Mocanu
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-liu24f
%I PMLR
%P 3952--3960
%U https://proceedings.mlr.press/v238/liu24f.html
%V 238
%X Feature selection algorithms aim to select a subset of informative features from a dataset to reduce the data dimensionality, thereby lowering resource consumption and improving the model's performance and interpretability. In recent years, feature selection based on neural networks has become a new trend, demonstrating superiority over traditional feature selection methods. However, most existing methods use dense neural networks to detect informative features, which incurs significant computational and memory overhead. In this paper, taking inspiration from the successful application of local sensitivity analysis to neural networks, we propose "GradEnFS", a novel resource-efficient supervised feature selection algorithm based on a sparse multi-layer perceptron. By utilizing the gradient information of various sparse models from different training iterations, our method successfully detects the informative feature subset. We performed extensive experiments on nine classification datasets spanning various domains to evaluate the effectiveness of our method. The results demonstrate that our proposed approach outperforms state-of-the-art methods in selecting informative features while substantially reducing resource consumption. Moreover, we show that using a sparse neural network for feature selection not only reduces resource consumption but also offers a significant advantage over other methods when performing feature selection on noisy datasets.
APA
Liu, K., Atashgahi, Z., Sokar, G., Pechenizkiy, M. & Mocanu, D. C. (2024). Supervised Feature Selection via Ensemble Gradient Information from Sparse Neural Networks. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3952-3960. Available from https://proceedings.mlr.press/v238/liu24f.html.

Related Material

Download PDF: https://proceedings.mlr.press/v238/liu24f/liu24f.pdf