Cost-sensitive Multiclass Classification Risk Bounds

Bernardo Ávila Pires, Csaba Szepesvari, Mohammad Ghavamzadeh
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1391-1399, 2013.

Abstract

A commonly used approach to multiclass classification is to replace the 0-1 loss with a convex surrogate so as to make empirical risk minimization computationally tractable. Previous work has uncovered sufficient and necessary conditions for the consistency of the resulting procedures. In this paper, we strengthen these results by showing how the 0-1 excess loss of a predictor can be upper bounded as a function of the excess loss of the predictor measured using the convex surrogate. The bound is developed for the case of cost-sensitive multiclass classification and a convex surrogate loss that goes back to the work of Lee, Lin and Wahba. The bounds are as easy to calculate as in binary classification. Furthermore, we also show that our analysis extends to the analysis of the recently introduced “Simplex Coding” scheme.
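The kind of guarantee described in the abstract is typically stated as a comparison inequality between excess risks. As a hedged sketch (a generic calibration-style bound, not the paper's exact statement), with $\phi$ the convex surrogate, $R_{01}$ and $R_\phi$ the 0-1 and surrogate risks, and $\psi$ a nondecreasing calibration function with $\psi(0)=0$:

```latex
% Generic surrogate risk bound (illustrative form only):
% the 0-1 excess risk of a predictor f is controlled by its
% surrogate excess risk through a calibration function \psi.
\[
  \psi\bigl( R_{01}(f) - \inf_{g} R_{01}(g) \bigr)
  \;\le\;
  R_{\phi}(f) - \inf_{g} R_{\phi}(g).
\]
% For the Lee-Lin-Wahba multiclass surrogate, f maps to a score
% vector constrained to sum to zero, and the surrogate loss has
% the (assumed, illustrative) form
\[
  \phi\bigl(f(x), y\bigr) \;=\; \sum_{y' \neq y} \varphi\bigl(f_{y'}(x)\bigr),
\]
% where \varphi is a convex upper bound on the 0-1 step,
% e.g. the hinge-type choice \varphi(t) = (1 + t)_+.
```

Inverting $\psi$ then upper-bounds the 0-1 excess loss as a function of the surrogate excess loss, which is the shape of result the abstract refers to; the paper's contribution is making this explicit and easy to compute in the cost-sensitive multiclass setting.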

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-avilapires13,
  title     = {Cost-sensitive Multiclass Classification Risk Bounds},
  author    = {Ávila Pires, Bernardo and Szepesvari, Csaba and Ghavamzadeh, Mohammad},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {1391--1399},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/avilapires13.pdf},
  url       = {https://proceedings.mlr.press/v28/avilapires13.html},
  abstract  = {A commonly used approach to multiclass classification is to replace the 0-1 loss with a convex surrogate so as to make empirical risk minimization computationally tractable. Previous work has uncovered sufficient and necessary conditions for the consistency of the resulting procedures. In this paper, we strengthen these results by showing how the 0-1 excess loss of a predictor can be upper bounded as a function of the excess loss of the predictor measured using the convex surrogate. The bound is developed for the case of cost-sensitive multiclass classification and a convex surrogate loss that goes back to the work of Lee, Lin and Wahba. The bounds are as easy to calculate as in binary classification. Furthermore, we also show that our analysis extends to the analysis of the recently introduced “Simplex Coding” scheme.}
}
Endnote
%0 Conference Paper
%T Cost-sensitive Multiclass Classification Risk Bounds
%A Bernardo Ávila Pires
%A Csaba Szepesvari
%A Mohammad Ghavamzadeh
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-avilapires13
%I PMLR
%P 1391--1399
%U https://proceedings.mlr.press/v28/avilapires13.html
%V 28
%N 3
%X A commonly used approach to multiclass classification is to replace the 0-1 loss with a convex surrogate so as to make empirical risk minimization computationally tractable. Previous work has uncovered sufficient and necessary conditions for the consistency of the resulting procedures. In this paper, we strengthen these results by showing how the 0-1 excess loss of a predictor can be upper bounded as a function of the excess loss of the predictor measured using the convex surrogate. The bound is developed for the case of cost-sensitive multiclass classification and a convex surrogate loss that goes back to the work of Lee, Lin and Wahba. The bounds are as easy to calculate as in binary classification. Furthermore, we also show that our analysis extends to the analysis of the recently introduced “Simplex Coding” scheme.
RIS
TY  - CPAPER
TI  - Cost-sensitive Multiclass Classification Risk Bounds
AU  - Bernardo Ávila Pires
AU  - Csaba Szepesvari
AU  - Mohammad Ghavamzadeh
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/05/26
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-avilapires13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 3
SP  - 1391
EP  - 1399
L1  - http://proceedings.mlr.press/v28/avilapires13.pdf
UR  - https://proceedings.mlr.press/v28/avilapires13.html
AB  - A commonly used approach to multiclass classification is to replace the 0-1 loss with a convex surrogate so as to make empirical risk minimization computationally tractable. Previous work has uncovered sufficient and necessary conditions for the consistency of the resulting procedures. In this paper, we strengthen these results by showing how the 0-1 excess loss of a predictor can be upper bounded as a function of the excess loss of the predictor measured using the convex surrogate. The bound is developed for the case of cost-sensitive multiclass classification and a convex surrogate loss that goes back to the work of Lee, Lin and Wahba. The bounds are as easy to calculate as in binary classification. Furthermore, we also show that our analysis extends to the analysis of the recently introduced “Simplex Coding” scheme.
ER  -
APA
Ávila Pires, B., Szepesvari, C. & Ghavamzadeh, M. (2013). Cost-sensitive Multiclass Classification Risk Bounds. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):1391-1399. Available from https://proceedings.mlr.press/v28/avilapires13.html.