Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting

Oscar Beijbom, Mohammad Saberian, David Kriegman, Nuno Vasconcelos
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):586-594, 2014.

Abstract

Cost-sensitive multiclass classification has recently acquired significance in several applications through the introduction of multiclass datasets with well-defined misclassification costs. The design of classification algorithms for this setting is considered. It is argued that the unreliable performance of current algorithms is due to the inability of the underlying loss functions to enforce a fundamental property. This property, denoted guess-aversion, is that the loss should encourage correct classifications over the arbitrary guessing that ensues when all classes are equally scored by the classifier. While guess-aversion holds trivially for binary classification, this is not true in the multiclass setting. A new family of cost-sensitive guess-averse loss functions is derived, and used to design new cost-sensitive multiclass boosting algorithms, denoted GEL- and GLL-MCBoost. Extensive experiments demonstrate (1) the general importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.
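The guess-aversion property can be checked numerically: at an all-zero score vector every class is tied and the classifier must guess, so a guess-averse loss should assign that point a strictly higher value than a score vector favoring the true class. The sketch below uses a hypothetical cost-weighted pairwise exponential loss for illustration; it is one member of the general family the paper studies, not necessarily the exact GEL or GLL form.

```python
import numpy as np

def pairwise_exp_loss(C, y, f):
    """Hypothetical cost-weighted pairwise exponential loss for one example.

    C : (K, K) misclassification cost matrix, where C[y, k] is the cost
        of predicting class k when the true class is y (C[y, y] = 0).
    y : true class index.
    f : (K,) vector of classifier scores.
    """
    k = np.arange(len(f)) != y                 # all wrong-class indices
    return np.sum(C[y, k] * np.exp(f[k] - f[y]))

K = 3
C = np.ones((K, K)) - np.eye(K)                # 0/1 costs, for illustration
y = 1

f_guess = np.zeros(K)                          # all classes tied: guessing
f_correct = np.zeros(K)
f_correct[y] = 1.0                             # score favors the true class

# Guess-aversion: the tie must be penalized more than correct scoring.
assert pairwise_exp_loss(C, y, f_correct) < pairwise_exp_loss(C, y, f_guess)
```

Losses that violate this inequality can make arbitrary guessing look as good as a correct prediction during training, which is the failure mode the paper attributes to unreliable cost-sensitive boosters.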

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-beijbom14,
  title     = {Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting},
  author    = {Beijbom, Oscar and Saberian, Mohammad and Kriegman, David and Vasconcelos, Nuno},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {586--594},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/beijbom14.pdf},
  url       = {https://proceedings.mlr.press/v32/beijbom14.html},
  abstract  = {Cost-sensitive multiclass classification has recently acquired significance in several applications, through the introduction of multiclass datasets with well-defined misclassification costs. The design of classification algorithms for this setting is considered. It is argued that the unreliable performance of current algorithms is due to the inability of the underlying loss functions to enforce a certain fundamental underlying property. This property, denoted guess-aversion, is that the loss should encourage correct classifications over the arbitrary guessing that ensues when all classes are equally scored by the classifier. While guess-aversion holds trivially for binary classification, this is not true in the multiclass setting. A new family of cost-sensitive guess-averse loss functions is derived, and used to design new cost-sensitive multiclass boosting algorithms, denoted GEL- and GLL-MCBoost. Extensive experiments demonstrate (1) the general importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.}
}
Endnote
%0 Conference Paper
%T Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting
%A Oscar Beijbom
%A Mohammad Saberian
%A David Kriegman
%A Nuno Vasconcelos
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-beijbom14
%I PMLR
%P 586--594
%U https://proceedings.mlr.press/v32/beijbom14.html
%V 32
%N 2
%X Cost-sensitive multiclass classification has recently acquired significance in several applications, through the introduction of multiclass datasets with well-defined misclassification costs. The design of classification algorithms for this setting is considered. It is argued that the unreliable performance of current algorithms is due to the inability of the underlying loss functions to enforce a certain fundamental underlying property. This property, denoted guess-aversion, is that the loss should encourage correct classifications over the arbitrary guessing that ensues when all classes are equally scored by the classifier. While guess-aversion holds trivially for binary classification, this is not true in the multiclass setting. A new family of cost-sensitive guess-averse loss functions is derived, and used to design new cost-sensitive multiclass boosting algorithms, denoted GEL- and GLL-MCBoost. Extensive experiments demonstrate (1) the general importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.
RIS
TY - CPAPER
TI - Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting
AU - Oscar Beijbom
AU - Mohammad Saberian
AU - David Kriegman
AU - Nuno Vasconcelos
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-beijbom14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 586
EP - 594
L1 - http://proceedings.mlr.press/v32/beijbom14.pdf
UR - https://proceedings.mlr.press/v32/beijbom14.html
AB - Cost-sensitive multiclass classification has recently acquired significance in several applications, through the introduction of multiclass datasets with well-defined misclassification costs. The design of classification algorithms for this setting is considered. It is argued that the unreliable performance of current algorithms is due to the inability of the underlying loss functions to enforce a certain fundamental underlying property. This property, denoted guess-aversion, is that the loss should encourage correct classifications over the arbitrary guessing that ensues when all classes are equally scored by the classifier. While guess-aversion holds trivially for binary classification, this is not true in the multiclass setting. A new family of cost-sensitive guess-averse loss functions is derived, and used to design new cost-sensitive multiclass boosting algorithms, denoted GEL- and GLL-MCBoost. Extensive experiments demonstrate (1) the general importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.
ER -
APA
Beijbom, O., Saberian, M., Kriegman, D. & Vasconcelos, N. (2014). Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):586-594. Available from https://proceedings.mlr.press/v32/beijbom14.html.

Related Material