The Feature Selection Path in Kernel Methods

Fuxin Li, Cristian Sminchisescu
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings 9:445-452, 2010.

Abstract

The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the feature weights and the parameters of the kernel model simultaneously, using L_1 regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, which runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm automatically discards irrelevant features and can move back and forth along the path to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient at separating relevant from irrelevant features.
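To illustrate the regularization-path idea in the abstract, here is a minimal sketch for a plain *linear* model: cyclic coordinate descent with soft-thresholding traces out how features enter the active set as the L_1 penalty lambda shrinks. This is only an analogy for intuition — the paper's contribution is an ODE-based homotopy for the *kernel* formulation, which this toy does not implement; the data, function names, and lambda grid below are all illustrative assumptions.

```python
# Toy L1 regularization path for a linear model (lasso via cyclic
# coordinate descent). Illustrates the general idea only: as lambda
# decreases, features enter the model one by one, and irrelevant
# (noise) features stay out until lambda is very small. Not the
# authors' kernel-path algorithm.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5                      # 5 features: first 2 relevant, rest pure noise
X = rng.standard_normal((n, d))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

def lasso_cd(X, y, lam, iters=200):
    """Cyclic coordinate descent for (1/2n)*||y - Xw||^2 + lam*||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]          # residual excluding feature j
            rho = X[:, j] @ r / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

# Follow the path from a strong penalty (empty model) toward none.
for lam in [2.5, 1.0, 0.1, 0.001]:
    w = lasso_cd(X, y, lam)
    active = np.flatnonzero(np.abs(w) > 1e-6)
    print(f"lambda={lam:<6} active features: {active.tolist()}")
```

With a large lambda the active set is empty or contains only the strongest feature; as lambda decreases, the two relevant features appear while the noise features remain excluded over a wide range of penalties, which is the feature-separation behavior the path is meant to expose.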

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-li10a,
  title     = {The Feature Selection Path in Kernel Methods},
  author    = {Fuxin Li and Cristian Sminchisescu},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {445--452},
  year      = {2010},
  editor    = {Yee Whye Teh and Mike Titterington},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {JMLR Workshop and Conference Proceedings},
  pdf       = {http://proceedings.mlr.press/v9/li10a/li10a.pdf},
  url       = {http://proceedings.mlr.press/v9/li10a.html},
  abstract  = {The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using L_1 regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.}
}
Endnote
%0 Conference Paper
%T The Feature Selection Path in Kernel Methods
%A Fuxin Li
%A Cristian Sminchisescu
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-li10a
%I PMLR
%J Proceedings of Machine Learning Research
%P 445--452
%U http://proceedings.mlr.press
%V 9
%W PMLR
%X The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using L_1 regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.
RIS
TY - CPAPER
TI - The Feature Selection Path in Kernel Methods
AU - Fuxin Li
AU - Cristian Sminchisescu
BT - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
PY - 2010/03/31
DA - 2010/03/31
ED - Yee Whye Teh
ED - Mike Titterington
ID - pmlr-v9-li10a
PB - PMLR
SP - 445
DP - PMLR
EP - 452
L1 - http://proceedings.mlr.press/v9/li10a/li10a.pdf
UR - http://proceedings.mlr.press/v9/li10a.html
AB - The problem of automatic feature selection/weighting in kernel methods is examined. We work on a formulation that optimizes both the weights of features and the parameters of the kernel model simultaneously, using L_1 regularization for feature selection. Under quite general choices of kernels, we prove that there exists a unique regularization path for this problem, that runs from 0 to a stationary point of the non-regularized problem. We propose an ODE-based homotopy method to follow this trajectory. By following the path, our algorithm is able to automatically discard irrelevant features and to automatically go back and forth to avoid local optima. Experiments on synthetic and real datasets show that the method achieves low prediction error and is efficient in separating relevant from irrelevant features.
ER -
APA
Li, F. & Sminchisescu, C. (2010). The Feature Selection Path in Kernel Methods. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in PMLR 9:445-452.