Sparse Hilbert-Schmidt Independence Criterion Regression

Benjamin Poignard, Makoto Yamada
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:538-548, 2020.

Abstract

Feature selection is a fundamental problem in machine learning and statistics, and it has been widely studied over the past decades. However, the majority of feature selection algorithms are based on linear models, and the nonlinear feature selection problem remains comparatively understudied, in particular in the high-dimensional case. In this paper, we propose the sparse Hilbert–Schmidt Independence Criterion (SpHSIC) regression, which is a versatile nonlinear feature selection algorithm based on the HSIC and is a continuous optimization variant of the well-known minimum redundancy maximum relevance (mRMR) feature selection algorithm. More specifically, the SpHSIC consists of two parts: the convex HSIC loss function on the one hand and the regularization term on the other hand, where we consider the Lasso, Bridge, MCP, and SCAD penalties. We prove that the sparsity-based HSIC regression estimator satisfies the oracle property; that is, it recovers the true underlying sparse model and is asymptotically normally distributed. On the basis of synthetic and real-world experiments, we illustrate this theoretical property and show that the proposed algorithm performs well in the high-dimensional setting.
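The abstract describes SpHSIC as a convex HSIC loss combined with a sparsity-inducing penalty, where nonzero coefficients indicate selected features. The sketch below illustrates that idea with an L1 (Lasso) penalty only; the Gaussian kernel with median-heuristic bandwidth, the `sphsic_lasso` helper, and the use of scikit-learn's Lasso solver are illustrative assumptions rather than the authors' implementation, and the Bridge, MCP, and SCAD variants would replace the L1 term with the corresponding penalty.

```python
import numpy as np
from sklearn.linear_model import Lasso

def centered_gaussian_kernel(z, sigma=None):
    """Centered Gaussian (RBF) kernel matrix of a single variable."""
    z = z.reshape(-1, 1)
    d2 = (z - z.T) ** 2
    if sigma is None:
        sigma = np.sqrt(np.median(d2[d2 > 0]))  # median heuristic (assumption)
    K = np.exp(-d2 / (2 * sigma ** 2))
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    return H @ K @ H

def sphsic_lasso(X, y, lam=0.01):
    """HSIC regression with an L1 penalty (Lasso variant, illustrative).

    Minimizes ||Lbar - sum_k beta_k Kbar_k||_F^2 + lam * ||beta||_1
    over beta >= 0; nonzero entries of beta mark selected features.
    """
    n, p = X.shape
    Lbar = centered_gaussian_kernel(y)
    # Flatten each centered feature kernel into one column of the design matrix.
    Phi = np.column_stack(
        [centered_gaussian_kernel(X[:, k]).ravel() for k in range(p)]
    )
    model = Lasso(alpha=lam, positive=True, fit_intercept=False)
    model.fit(Phi, Lbar.ravel())
    return model.coef_  # nonlinear feature importance scores

# Toy usage: only the first two of ten features drive the response,
# so the largest weights should land on features 0 and 1 (lam may need tuning).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=100)
print(np.round(sphsic_lasso(X, y, lam=0.001), 3))
```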

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-poignard20a,
  title     = {Sparse Hilbert-Schmidt Independence Criterion Regression},
  author    = {Poignard, Benjamin and Yamada, Makoto},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {538--548},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/poignard20a/poignard20a.pdf},
  url       = {https://proceedings.mlr.press/v108/poignard20a.html},
  abstract  = {Feature selection is a fundamental problem for machine learning and statistics, and it has been widely studied over the past decades. However, the majority of feature selection algorithms are based on linear models, and the nonlinear feature selection problem has not been well studied compared to linear models, in particular for the high-dimensional case. In this paper, we propose the sparse Hilbert–Schmidt Independence Criterion (SpHSIC) regression, which is a versatile nonlinear feature selection algorithm based on the HSIC and is a continuous optimization variant of the well-known minimum redundancy maximum relevance (mRMR) feature selection algorithm. More specifically, the SpHSIC consists of two parts: the convex HSIC loss function on the one hand and the regularization term on the other hand, where we consider the Lasso, Bridge, MCP, and SCAD penalties. We prove that the sparsity based HSIC regression estimator satisfies the oracle property; that is, the sparsity-based estimator recovers the true underlying sparse model and is asymptotically normally distributed. On the basis of synthetic and real-world experiments, we illustrate this theoretical property and highlight the fact that the proposed algorithm performs well in the high-dimensional setting.}
}
EndNote
%0 Conference Paper
%T Sparse Hilbert-Schmidt Independence Criterion Regression
%A Benjamin Poignard
%A Makoto Yamada
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-poignard20a
%I PMLR
%P 538--548
%U https://proceedings.mlr.press/v108/poignard20a.html
%V 108
%X Feature selection is a fundamental problem for machine learning and statistics, and it has been widely studied over the past decades. However, the majority of feature selection algorithms are based on linear models, and the nonlinear feature selection problem has not been well studied compared to linear models, in particular for the high-dimensional case. In this paper, we propose the sparse Hilbert–Schmidt Independence Criterion (SpHSIC) regression, which is a versatile nonlinear feature selection algorithm based on the HSIC and is a continuous optimization variant of the well-known minimum redundancy maximum relevance (mRMR) feature selection algorithm. More specifically, the SpHSIC consists of two parts: the convex HSIC loss function on the one hand and the regularization term on the other hand, where we consider the Lasso, Bridge, MCP, and SCAD penalties. We prove that the sparsity based HSIC regression estimator satisfies the oracle property; that is, the sparsity-based estimator recovers the true underlying sparse model and is asymptotically normally distributed. On the basis of synthetic and real-world experiments, we illustrate this theoretical property and highlight the fact that the proposed algorithm performs well in the high-dimensional setting.
APA
Poignard, B. & Yamada, M. (2020). Sparse Hilbert-Schmidt Independence Criterion Regression. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:538-548. Available from https://proceedings.mlr.press/v108/poignard20a.html.
