Efficient variable selection in support vector machines via the alternating direction method of multipliers


Gui-Bo Ye, Yifei Chen, Xiaohui Xie;
Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, PMLR 15:832-840, 2011.


The support vector machine (SVM) is a widely used tool for classification. Although commonly understood as a method for finding the maximum-margin hyperplane, it can also be formulated as a regularized function estimation problem, corresponding to a hinge loss function plus an l2-norm regularization term. The doubly regularized support vector machine (DrSVM) is a variant of the standard SVM that introduces an additional l1-norm regularization term on the fitted coefficients. The combined l1 and l2 regularization, termed the elastic net penalty, has the appealing property of achieving simultaneous variable selection and margin maximization within a single framework. However, because both the loss function and the regularization term are nonsmooth, no efficient method has been available to solve DrSVM for large-scale problems. Here we develop an efficient algorithm based on the alternating direction method of multipliers (ADMM) to solve the optimization problem in DrSVM. The utility of the method is further illustrated using both simulated and real-world datasets.
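To make the objective concrete, the sketch below solves the DrSVM problem, minimize over w of sum_i max(0, 1 - y_i x_i^T w) + lam1*||w||_1 + (lam2/2)*||w||_2^2, with a generic ADMM splitting. This is an illustrative splitting (one auxiliary variable for the hinge argument, one for the l1 term), not necessarily the exact algorithm of the paper; the function name, parameters, and toy data are all hypothetical.

```python
import numpy as np

def drsvm_admm(X, y, lam1=0.1, lam2=0.1, rho=1.0, n_iter=200):
    """Sketch of an ADMM solver for the doubly regularized SVM.
    Splitting (illustrative, not necessarily the paper's exact scheme):
    u = A w with A = diag(y) X carries the hinge loss, z = w carries the l1 term."""
    n, p = X.shape
    A = y[:, None] * X
    w = np.zeros(p)
    u = np.zeros(n)       # copy of A w, updated via the hinge-loss prox
    z = np.zeros(p)       # copy of w, updated via soft-thresholding
    alpha = np.zeros(n)   # scaled dual variable for the constraint u = A w
    beta = np.zeros(p)    # scaled dual variable for the constraint z = w
    # The w-update is a fixed quadratic problem; its system matrix never changes.
    M = lam2 * np.eye(p) + rho * (A.T @ A) + rho * np.eye(p)
    for _ in range(n_iter):
        # w-update: minimize (lam2/2)||w||^2 + (rho/2)||Aw-u+alpha||^2 + (rho/2)||w-z+beta||^2
        w = np.linalg.solve(M, rho * A.T @ (u - alpha) + rho * (z - beta))
        # u-update: elementwise prox of the hinge loss h(u) = max(0, 1 - u)
        v = A @ w + alpha
        u = np.where(v > 1.0, v,
                     np.where(v < 1.0 - 1.0 / rho, v + 1.0 / rho, 1.0))
        # z-update: soft-thresholding, the prox of lam1*||.||_1
        t = w + beta
        z = np.sign(t) * np.maximum(np.abs(t) - lam1 / rho, 0.0)
        # dual ascent on both consensus constraints
        alpha += A @ w - u
        beta += w - z
    return z  # the sparse copy of the coefficient vector

# Hypothetical toy example: 2 informative features out of 6.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = np.sign(X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=100))
w = drsvm_admm(X, y, lam1=0.5, lam2=0.1)
```

Each ADMM subproblem here is cheap in closed form, which is the point of the splitting: the quadratic w-update reuses one matrix, the hinge prox and soft-thresholding are elementwise, and the l1 copy z is the one that goes exactly to zero on irrelevant features.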
