Learning Weighted Top-$k$ Support Vector Machine

Tsuyoshi Kato, Yoshihiro Hirohashi
Proceedings of The Eleventh Asian Conference on Machine Learning, PMLR 101:774-789, 2019.

Abstract

Nowadays, the top-$k$ accuracy is a major performance criterion when benchmarking multi-class classifiers on datasets with a large number of categories. The top-$k$ multiclass SVM has been designed to minimize the empirical risk based on the top-$k$ accuracy. Two SDCA-based algorithms already exist for learning the top-$k$ SVM, and they enjoy several properties favorable for optimization, yet both suffer from two disadvantages. One weakness is that, because their design is specialized to the top-$k$ hinge loss, their applicability to other variants is limited. The other is that neither algorithm can attain the optimal solution in most cases owing to theoretical imperfections. In this study, a weighted extension of the top-$k$ SVM is considered, and novel learning algorithms based on the Frank-Wolfe algorithm are devised. The new learning algorithms possess all the favorable properties of SDCA and are applicable not only to the original top-$k$ SVM but also to the weighted extension. Geometrical convergence is achieved by smoothing the loss functions. Numerical simulations demonstrate that only the proposed Frank-Wolfe algorithms converge to the optimum, in contrast with the failure of the two existing SDCA-based algorithms. Finally, analytical results for these two existing studies are presented to shed light on the meaning of the solutions their algorithms produce.
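For readers unfamiliar with the two quantities the abstract centers on, the following plain-Python sketch illustrates the top-$k$ accuracy criterion and a top-$k$ hinge loss of the kind the paper builds on. This is a hypothetical illustration under our own naming, not the paper's implementation or its weighted extension.

```python
def top_k_accuracy(scores, labels, k):
    """Fraction of samples whose true label ranks among the k highest scores."""
    hits = 0
    for s, y in zip(scores, labels):
        # classes ranked by descending score; a hit if y is within the first k
        ranked = sorted(range(len(s)), key=lambda j: s[j], reverse=True)
        hits += y in ranked[:k]
    return hits / len(labels)

def top_k_hinge(scores, y, k):
    """Top-k hinge loss for one sample: the average of the k largest
    margin violations 1 + s_j - s_y over wrong classes j, clipped at zero."""
    viol = [1.0 + s - scores[y] for j, s in enumerate(scores) if j != y]
    topk = sorted(viol, reverse=True)[:k]
    return max(0.0, sum(topk) / k)
```

A correctly classified sample with a unit margin incurs zero loss even for $k=1$, while the loss only penalizes the $k$ worst competing classes, which is what ties the surrogate to the top-$k$ accuracy.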

Cite this Paper
BibTeX
@InProceedings{pmlr-v101-kato19a,
  title     = {Learning Weighted Top-$k$ Support Vector Machine},
  author    = {Kato, Tsuyoshi and Hirohashi, Yoshihiro},
  booktitle = {Proceedings of The Eleventh Asian Conference on Machine Learning},
  pages     = {774--789},
  year      = {2019},
  editor    = {Lee, Wee Sun and Suzuki, Taiji},
  volume    = {101},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v101/kato19a/kato19a.pdf},
  url       = {https://proceedings.mlr.press/v101/kato19a.html},
  abstract  = {Nowadays, the top-$k$ accuracy is a major performance criterion when benchmarking multi-class classifiers on datasets with a large number of categories. The top-$k$ multiclass SVM has been designed to minimize the empirical risk based on the top-$k$ accuracy. Two SDCA-based algorithms already exist for learning the top-$k$ SVM, and they enjoy several properties favorable for optimization, yet both suffer from two disadvantages. One weakness is that, because their design is specialized to the top-$k$ hinge loss, their applicability to other variants is limited. The other is that neither algorithm can attain the optimal solution in most cases owing to theoretical imperfections. In this study, a weighted extension of the top-$k$ SVM is considered, and novel learning algorithms based on the Frank-Wolfe algorithm are devised. The new learning algorithms possess all the favorable properties of SDCA and are applicable not only to the original top-$k$ SVM but also to the weighted extension. Geometrical convergence is achieved by smoothing the loss functions. Numerical simulations demonstrate that only the proposed Frank-Wolfe algorithms converge to the optimum, in contrast with the failure of the two existing SDCA-based algorithms. Finally, analytical results for these two existing studies are presented to shed light on the meaning of the solutions their algorithms produce.}
}
EndNote
%0 Conference Paper
%T Learning Weighted Top-$k$ Support Vector Machine
%A Tsuyoshi Kato
%A Yoshihiro Hirohashi
%B Proceedings of The Eleventh Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Wee Sun Lee
%E Taiji Suzuki
%F pmlr-v101-kato19a
%I PMLR
%P 774--789
%U https://proceedings.mlr.press/v101/kato19a.html
%V 101
%X Nowadays, the top-$k$ accuracy is a major performance criterion when benchmarking multi-class classifiers on datasets with a large number of categories. The top-$k$ multiclass SVM has been designed to minimize the empirical risk based on the top-$k$ accuracy. Two SDCA-based algorithms already exist for learning the top-$k$ SVM, and they enjoy several properties favorable for optimization, yet both suffer from two disadvantages. One weakness is that, because their design is specialized to the top-$k$ hinge loss, their applicability to other variants is limited. The other is that neither algorithm can attain the optimal solution in most cases owing to theoretical imperfections. In this study, a weighted extension of the top-$k$ SVM is considered, and novel learning algorithms based on the Frank-Wolfe algorithm are devised. The new learning algorithms possess all the favorable properties of SDCA and are applicable not only to the original top-$k$ SVM but also to the weighted extension. Geometrical convergence is achieved by smoothing the loss functions. Numerical simulations demonstrate that only the proposed Frank-Wolfe algorithms converge to the optimum, in contrast with the failure of the two existing SDCA-based algorithms. Finally, analytical results for these two existing studies are presented to shed light on the meaning of the solutions their algorithms produce.
APA
Kato, T. & Hirohashi, Y. (2019). Learning Weighted Top-$k$ Support Vector Machine. Proceedings of The Eleventh Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 101:774-789. Available from https://proceedings.mlr.press/v101/kato19a.html.

Related Material