Convex formulations of radius-margin based Support Vector Machines

Huyen Do, Alexandros Kalousis
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):169-177, 2013.

Abstract

We consider Support Vector Machines (SVMs) learned together with linear transformations of the feature spaces on which they are applied. Under this scenario the radius of the smallest data-enclosing sphere is no longer fixed, so optimizing an error bound that accounts for both the radius and the margin has the potential to deliver a tighter bound. In this paper we present two novel algorithms: R-SVM_μ^+, an SVM radius-margin based feature selection algorithm, and R-SVM^+, a metric learning based SVM. We derive our algorithms by exploiting a new, tighter approximation of the radius and a metric learning interpretation of SVM. Both directly optimize the radius-margin error bound over linear transformations. Unlike almost all existing radius-margin based SVM algorithms, which are either non-convex or combinatorial, our algorithms are standard convex quadratic optimization problems with linear or quadratic constraints. We perform a number of experiments on benchmark datasets. R-SVM_μ^+ exhibits excellent feature selection performance compared to state-of-the-art feature selection methods such as L_1-norm and elastic-net based methods. R-SVM^+ achieves significantly better classification performance than SVM and its other state-of-the-art variants. The results make clear that incorporating the radius into the cost function, as a means to control the data spread, has strong beneficial effects.
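For context, the classical radius-margin result (Vapnik) bounds the leave-one-out error of a hard-margin SVM by R²‖w‖²/n, where R is the radius of the smallest sphere enclosing the data and 1/‖w‖ is the margin. Since a linear transformation of the features changes both R and ‖w‖, minimizing the product R²‖w‖² rather than ‖w‖² alone can tighten the bound, which is the quantity the paper's algorithms target. The sketch below is illustrative only and is not the paper's R-SVM^+ or R-SVM_μ^+ algorithm: it evaluates the two ingredients of the ratio for a fixed linear-kernel representation, obtaining the squared radius from the standard dual QP of the minimum enclosing ball and the margin from an off-the-shelf SVM. The helper name enclosing_ball_radius_sq and the toy data are assumptions made for the example.

    # A minimal sketch (not the paper's algorithm) of the radius-margin
    # quantity R^2 * ||w||^2. Assumes numpy, scipy, scikit-learn.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.svm import SVC

    def enclosing_ball_radius_sq(X):
        """Squared radius of the smallest enclosing ball (linear kernel),
        via the standard dual QP:
            max_b  sum_i b_i K_ii - b' K b
            s.t.   b_i >= 0, sum_i b_i = 1,
        whose optimal value equals R^2."""
        K = X @ X.T
        n = len(X)
        diag = np.diag(K)
        obj = lambda b: -(b @ diag - b @ K @ b)   # negate: maximize via minimize
        cons = ({"type": "eq", "fun": lambda b: b.sum() - 1.0},)
        bnds = [(0.0, 1.0)] * n
        b0 = np.full(n, 1.0 / n)                  # feasible starting point
        res = minimize(obj, b0, bounds=bnds, constraints=cons, method="SLSQP")
        return -res.fun

    # Toy data: two Gaussian blobs (assumed for illustration).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
    y = np.array([-1] * 20 + [1] * 20)

    svm = SVC(kernel="linear", C=1.0).fit(X, y)
    w = svm.coef_.ravel()
    margin_sq = 1.0 / (w @ w)          # geometric margin^2 of the learned hyperplane
    R_sq = enclosing_ball_radius_sq(X)

    # Rescaling the features changes R^2 and ||w||^2 jointly, which is why
    # learning a linear transformation can tighten the radius-margin bound.
    print(f"R^2 = {R_sq:.3f}, ||w||^2 = {w @ w:.3f}, "
          f"R^2 * ||w||^2 = {R_sq * (w @ w):.3f}")

The paper's algorithms go further, learning the linear transformation itself inside a convex program; the sketch only evaluates the bound's ingredients for a fixed representation.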

Cite this Paper

BibTeX
@InProceedings{pmlr-v28-do13,
  title     = {Convex formulations of radius-margin based Support Vector Machines},
  author    = {Do, Huyen and Kalousis, Alexandros},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {169--177},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/do13.pdf},
  url       = {https://proceedings.mlr.press/v28/do13.html}
}
Endnote
%0 Conference Paper
%T Convex formulations of radius-margin based Support Vector Machines
%A Huyen Do
%A Alexandros Kalousis
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-do13
%I PMLR
%P 169--177
%U https://proceedings.mlr.press/v28/do13.html
%V 28
%N 1
RIS
TY - CPAPER
TI - Convex formulations of radius-margin based Support Vector Machines
AU - Huyen Do
AU - Alexandros Kalousis
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/02/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-do13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 1
SP - 169
EP - 177
L1 - http://proceedings.mlr.press/v28/do13.pdf
UR - https://proceedings.mlr.press/v28/do13.html
ER -
APA
Do, H. & Kalousis, A. (2013). Convex formulations of radius-margin based Support Vector Machines. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(1):169-177. Available from https://proceedings.mlr.press/v28/do13.html.

Related Material

Download PDF: http://proceedings.mlr.press/v28/do13.pdf