Proper Losses for Learning with Example-Dependent Costs

Alexander Hepburn, Ryan McConville, Raúl Santos-Rodríguez, Jesús Cid-Sueiro, Dario García-García
Proceedings of the Second International Workshop on Learning with Imbalanced Domains: Theory and Applications, PMLR 94:52-66, 2018.

Abstract

We study the design of cost-sensitive learning algorithms with example-dependent costs, when cost matrices for each example are given both during training and test. The approach is based on the empirical risk minimization framework, where we replace the standard loss function by a combination of surrogate losses belonging to the family of proper losses. The actual contribution of each example to the risk is then given by a loss that depends on the cost matrix for the specific example. We then evaluate the use of such example-dependent loss functions in real-world binary and multiclass problems, namely credit risk assessment and musical genre classification. Using different neural network architectures, we show that with the appropriate choice of the example-dependent losses, we can outperform conventional cost-sensitive methods in terms of total cost, making a more efficient use of cost information during training and test as compared to existing discriminative approaches.
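To make the idea in the abstract concrete, the sketch below shows one simple way a per-example cost matrix can weight a proper surrogate such as the log-loss, so that each example contributes to the empirical risk according to its own costs. This is a minimal illustration under stated assumptions, not the paper's exact construction: the choice of PyTorch, the function name, and the cost tensors c_fn and c_fp (standing in for entries of each example's cost matrix) are all hypothetical.

    # Minimal sketch (assumed, not taken from the paper): a binary log-loss whose
    # two terms are scaled by each example's own misclassification costs, so the
    # contribution of every example to the empirical risk depends on its cost matrix.
    import torch

    def example_dependent_log_loss(logits, targets, c_fn, c_fp, eps=1e-7):
        """Cost-weighted binary log-loss.

        logits  -- raw model outputs, shape (N,)
        targets -- binary labels in {0, 1}, shape (N,)
        c_fn    -- per-example cost of a false negative, shape (N,)
        c_fp    -- per-example cost of a false positive, shape (N,)
        """
        p = torch.sigmoid(logits).clamp(eps, 1 - eps)
        per_example = -(c_fn * targets * torch.log(p)
                        + c_fp * (1 - targets) * torch.log(1 - p))
        return per_example.mean()

    # Usage: each example carries its own costs, supplied at training time.
    logits = torch.randn(4, requires_grad=True)
    targets = torch.tensor([1.0, 0.0, 1.0, 0.0])
    c_fn = torch.tensor([5.0, 1.0, 2.0, 1.0])   # cost of missing each positive
    c_fp = torch.tensor([1.0, 3.0, 1.0, 0.5])   # cost of a false alarm on each example
    loss = example_dependent_log_loss(logits, targets, c_fn, c_fp)
    loss.backward()

When the costs are identical across examples this reduces to an ordinary class-weighted cross-entropy; the example-dependent setting studied in the paper lets each cost matrix differ per instance at both training and test time.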

Cite this Paper


BibTeX
@InProceedings{pmlr-v94-hepburn18a,
  title     = {Proper Losses for Learning with Example-Dependent Costs},
  author    = {Hepburn, Alexander and McConville, Ryan and Santos-Rodr{\'i}guez, Ra{\'u}l and Cid-Sueiro, Jes{\'u}s and Garc{\'i}a-Garc{\'i}a, Dario},
  booktitle = {Proceedings of the Second International Workshop on Learning with Imbalanced Domains: Theory and Applications},
  pages     = {52--66},
  year      = {2018},
  editor    = {Torgo, Luís and Matwin, Stan and Japkowicz, Nathalie and Krawczyk, Bartosz and Moniz, Nuno and Branco, Paula},
  volume    = {94},
  series    = {Proceedings of Machine Learning Research},
  month     = {10 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v94/hepburn18a/hepburn18a.pdf},
  url       = {https://proceedings.mlr.press/v94/hepburn18a.html},
  abstract  = {We study the design of cost-sensitive learning algorithms with example-dependent costs, when cost matrices for each example are given both during training and test. The approach is based on the empirical risk minimization framework, where we replace the standard loss function by a combination of surrogate losses belonging to the family of proper losses. The actual contribution of each example to the risk is then given by a loss that depends on the cost matrix for the specific example. We then evaluate the use of such example-dependent loss functions in real-world binary and multiclass problems, namely credit risk assessment and musical genre classification. Using different neural network architectures, we show that with the appropriate choice of the example-dependent losses, we can outperform conventional cost-sensitive methods in terms of total cost, making a more efficient use of cost information during training and test as compared to existing discriminative approaches.}
}
Endnote
%0 Conference Paper
%T Proper Losses for Learning with Example-Dependent Costs
%A Alexander Hepburn
%A Ryan McConville
%A Raúl Santos-Rodríguez
%A Jesús Cid-Sueiro
%A Dario García-García
%B Proceedings of the Second International Workshop on Learning with Imbalanced Domains: Theory and Applications
%C Proceedings of Machine Learning Research
%D 2018
%E Luís Torgo
%E Stan Matwin
%E Nathalie Japkowicz
%E Bartosz Krawczyk
%E Nuno Moniz
%E Paula Branco
%F pmlr-v94-hepburn18a
%I PMLR
%P 52--66
%U https://proceedings.mlr.press/v94/hepburn18a.html
%V 94
%X We study the design of cost-sensitive learning algorithms with example-dependent costs, when cost matrices for each example are given both during training and test. The approach is based on the empirical risk minimization framework, where we replace the standard loss function by a combination of surrogate losses belonging to the family of proper losses. The actual contribution of each example to the risk is then given by a loss that depends on the cost matrix for the specific example. We then evaluate the use of such example-dependent loss functions in real-world binary and multiclass problems, namely credit risk assessment and musical genre classification. Using different neural network architectures, we show that with the appropriate choice of the example-dependent losses, we can outperform conventional cost-sensitive methods in terms of total cost, making a more efficient use of cost information during training and test as compared to existing discriminative approaches.
APA
Hepburn, A., McConville, R., Santos-Rodríguez, R., Cid-Sueiro, J. & García-García, D. (2018). Proper Losses for Learning with Example-Dependent Costs. Proceedings of the Second International Workshop on Learning with Imbalanced Domains: Theory and Applications, in Proceedings of Machine Learning Research 94:52-66. Available from https://proceedings.mlr.press/v94/hepburn18a.html.