An Improved 1-norm SVM for Simultaneous Classification and Variable Selection

Hui Zou
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:675-681, 2007.

Abstract

We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss by the adaptively weighted 1-norm penalty in which the weights are computed by the 2-norm SVM. Hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM not only often improves upon the 1-norm SVM in terms of classification accuracy but also enjoys better feature selection performance.
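The abstract describes a two-step procedure: fit a 2-norm SVM, use the magnitudes of its coefficients to build adaptive weights, then solve a weighted 1-norm SVM. A minimal sketch of that idea is below, using scikit-learn's LinearSVC; note this is an illustrative approximation, not the paper's solver (LinearSVC with an L1 penalty uses the squared hinge loss rather than the exact hinge loss, and the weighting scheme shown, scaling each feature by its pilot coefficient magnitude, is the standard rescaling trick for weighted L1 penalties).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Synthetic binary classification data with a few informative features.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)

# Step 1: fit a standard 2-norm SVM to obtain pilot coefficients.
svm2 = LinearSVC(penalty="l2", C=1.0, max_iter=10000).fit(X, y)
w = np.abs(svm2.coef_.ravel())  # adaptive weights from |2-norm SVM coefficients|

# Step 2: a weighted 1-norm penalty sum_j |beta_j| / w_j is equivalent to a
# plain 1-norm penalty after rescaling column j of X by w_j.
Xw = X * w
svm1 = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                 C=1.0, max_iter=10000).fit(Xw, y)

# Map coefficients back to the original feature scale; zeros indicate
# features dropped by the weighted 1-norm penalty.
beta = svm1.coef_.ravel() * w
selected = np.flatnonzero(beta)
```

Features with small pilot coefficients receive small weights, so their rescaled columns carry little signal and the L1 step tends to zero them out, which is the mechanism behind the improved variable-selection behavior the abstract claims.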

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-zou07a,
  title = {An Improved 1-norm SVM for Simultaneous Classification and Variable Selection},
  author = {Hui Zou},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages = {675--681},
  year = {2007},
  editor = {Marina Meila and Xiaotong Shen},
  volume = {2},
  series = {Proceedings of Machine Learning Research},
  address = {San Juan, Puerto Rico},
  month = {21--24 Mar},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v2/zou07a/zou07a.pdf},
  url = {http://proceedings.mlr.press/v2/zou07a.html},
  abstract = {We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss by the adaptively weighted 1-norm penalty in which the weights are computed by the 2-norm SVM. Hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM not only often improves upon the 1-norm SVM in terms of classification accuracy but also enjoys better feature selection performance.}
}
Endnote
%0 Conference Paper
%T An Improved 1-norm SVM for Simultaneous Classification and Variable Selection
%A Hui Zou
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-zou07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 675--681
%U http://proceedings.mlr.press
%V 2
%W PMLR
%X We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss by the adaptively weighted 1-norm penalty in which the weights are computed by the 2-norm SVM. Hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM not only often improves upon the 1-norm SVM in terms of classification accuracy but also enjoys better feature selection performance.
RIS
TY - CPAPER
TI - An Improved 1-norm SVM for Simultaneous Classification and Variable Selection
AU - Hui Zou
BT - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
PY - 2007/03/11
DA - 2007/03/11
ED - Marina Meila
ED - Xiaotong Shen
ID - pmlr-v2-zou07a
PB - PMLR
SP - 675
DP - PMLR
EP - 681
L1 - http://proceedings.mlr.press/v2/zou07a/zou07a.pdf
UR - http://proceedings.mlr.press/v2/zou07a.html
AB - We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss by the adaptively weighted 1-norm penalty in which the weights are computed by the 2-norm SVM. Hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM not only often improves upon the 1-norm SVM in terms of classification accuracy but also enjoys better feature selection performance.
ER -
APA
Zou, H. (2007). An Improved 1-norm SVM for Simultaneous Classification and Variable Selection. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in PMLR 2:675-681.