Sparse Hilbert-Schmidt Independence Criterion Regression
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:538-548, 2020.
Abstract
Feature selection is a fundamental problem in machine learning and statistics, and it has been widely studied over the past decades. However, the majority of feature selection algorithms are based on linear models, and nonlinear feature selection has received comparatively little attention, particularly in the high-dimensional case. In this paper, we propose sparse Hilbert–Schmidt Independence Criterion (SpHSIC) regression, a versatile nonlinear feature selection algorithm based on the HSIC that can be viewed as a continuous optimization variant of the well-known minimum redundancy maximum relevance (mRMR) feature selection algorithm. More specifically, SpHSIC consists of two parts: a convex HSIC loss function and a regularization term, for which we consider the Lasso, Bridge, MCP, and SCAD penalties. We prove that the sparsity-based HSIC regression estimator satisfies the oracle property; that is, it recovers the true underlying sparse model and is asymptotically normally distributed. On the basis of synthetic and real-world experiments, we illustrate this theoretical property and show that the proposed algorithm performs well in the high-dimensional setting.
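The abstract describes SpHSIC as a convex HSIC loss plus a sparsity-inducing penalty, where nonzero weights mark selected features. The following is a minimal sketch of that idea with an L1 (Lasso) penalty solved by projected proximal gradient descent; the Gaussian kernels, bandwidths, step size, normalization, and all function names are assumptions for illustration, not the authors' implementation, and the nonconvex penalties (Bridge, MCP, SCAD) would replace the soft-thresholding step.

import numpy as np

def centered_gaussian_gram(v, sigma=1.0):
    """Double-centered Gaussian Gram matrix of a single variable (shape (n,) or (n, d))."""
    v = np.atleast_2d(v.T).T                      # ensure shape (n, d)
    n = v.shape[0]
    sq = np.sum((v[:, None, :] - v[None, :, :]) ** 2, axis=2)
    K = np.exp(-sq / (2.0 * sigma ** 2))
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H                              # centering makes the Frobenius inner product an HSIC estimate

def sphsic_lasso(X, y, lam=0.1, n_iter=500, lr=0.5):
    """Sketch of an HSIC-loss + L1 objective:
       minimize 0.5 * ||Lbar - sum_k beta_k * Kbar_k||_F^2 / n^2 + lam * ||beta||_1,  beta >= 0.
       Nonzero entries of beta indicate selected features."""
    n, d = X.shape
    Ks = np.stack([centered_gaussian_gram(X[:, k]) for k in range(d)])  # (d, n, n)
    L = centered_gaussian_gram(y)                                       # (n, n)
    beta = np.zeros(d)
    for _ in range(n_iter):
        resid = np.tensordot(beta, Ks, axes=1) - L                      # current residual, (n, n)
        grad = np.array([np.sum(resid * Ks[k]) for k in range(d)]) / n ** 2
        beta = np.maximum(beta - lr * grad - lr * lam, 0.0)             # prox step for L1 with beta >= 0
    return beta

# Toy usage: only the first two features influence y, so their weights should dominate.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=100)
print(np.round(sphsic_lasso(X, y, lam=0.01), 3))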