The Feature Selection Path in Kernel Methods

Fuxin Li, Cristian Sminchisescu
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:445-452, 2010.

Abstract

The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using $L_1$ regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.
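The paper's ODE-based homotopy follows the exact path of the kernelized problem; as a loose illustration only (not the authors' algorithm), the general idea of tracing an $L_1$ regularization path can be sketched with a plain linear lasso, re-solved by coordinate descent at a decreasing grid of regularization strengths. All names and parameter values below are illustrative assumptions:

```python
import numpy as np

def lasso_path(X, y, lambdas, n_sweeps=200):
    """Trace an L1 regularization path: solve the lasso
    (1/2)||y - Xw||^2 + lam*||w||_1 by coordinate descent at each
    lam in a decreasing grid, warm-starting from the previous solution.
    As lam shrinks, relevant features tend to enter the model first."""
    n, d = X.shape
    w = np.zeros(d)
    col_norms = (X ** 2).sum(axis=0)
    path = []
    for lam in lambdas:
        for _ in range(n_sweeps):
            for j in range(d):
                # partial residual excluding feature j
                r = y - X @ w + X[:, j] * w[j]
                rho = X[:, j] @ r
                # soft-thresholding update for coordinate j
                w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_norms[j]
        path.append(w.copy())
    return np.array(path)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
# only features 0 and 1 are relevant
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(100)
path = lasso_path(X, y, lambdas=[1000.0, 50.0, 5.0])
print((np.abs(path) > 1e-8).sum(axis=1))  # active features at each lambda
```

Discretizing the path at a grid of $\lambda$ values, as above, only approximates the trajectory; the paper's contribution is following the (provably unique) continuous path with an ODE solver, including segments where features leave the active set again.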

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-li10a,
  title     = {The Feature Selection Path in Kernel Methods},
  author    = {Li, Fuxin and Sminchisescu, Cristian},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {445--452},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/li10a/li10a.pdf},
  url       = {https://proceedings.mlr.press/v9/li10a.html},
  abstract  = {The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using $L_1$ regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.}
}
Endnote
%0 Conference Paper
%T The Feature Selection Path in Kernel Methods
%A Fuxin Li
%A Cristian Sminchisescu
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-li10a
%I PMLR
%P 445--452
%U https://proceedings.mlr.press/v9/li10a.html
%V 9
%X The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using $L_1$ regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.
RIS
TY  - CPAPER
TI  - The Feature Selection Path in Kernel Methods
AU  - Fuxin Li
AU  - Cristian Sminchisescu
BT  - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA  - 2010/03/31
ED  - Yee Whye Teh
ED  - Mike Titterington
ID  - pmlr-v9-li10a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 9
SP  - 445
EP  - 452
L1  - http://proceedings.mlr.press/v9/li10a/li10a.pdf
UR  - https://proceedings.mlr.press/v9/li10a.html
AB  - The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using $L_1$ regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.
ER  - 
APA
Li, F. & Sminchisescu, C. (2010). The Feature Selection Path in Kernel Methods. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:445-452. Available from https://proceedings.mlr.press/v9/li10a.html.