Controlling Imbalanced Error in Deep Learning with the Log Bilinear Loss

Yehezkel S. Resheff, Amit Mandelbaum, Daphna Weinshall
Proceedings of the First International Workshop on Learning with Imbalanced Domains: Theory and Applications, PMLR 74:141-151, 2017.

Abstract

Deep learning has become the method of choice for many machine learning tasks in recent years, and especially for multi-class classification. The most common loss function used in this context is the cross-entropy loss. While this function is insensitive to the identity of the assigned class in the case of misclassification, in practice it is very common to have imbalanced sensitivity to error, meaning some wrong assignments are much worse than others. Here we present the bilinear-loss (and related log-bilinear-loss) which differentially penalizes the different wrong assignments of the model. We thoroughly test the proposed method using standard models and benchmark image datasets.
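To make the idea of differentially penalizing wrong assignments concrete, here is a minimal NumPy sketch of a cost-weighted (bilinear-style) loss. It assumes a penalty matrix `W` whose entry `W[i, j]` gives the cost of placing probability mass on class `j` when the true class is `i`; the function name, matrix, and example values are illustrative, not the paper's exact formulation.

```python
import numpy as np

def bilinear_loss(y_true, probs, W):
    """Cost-weighted loss: penalize the probability mass assigned to
    wrong classes according to a penalty matrix W.

    y_true: (n,) integer class labels
    probs:  (n, k) predicted class probabilities (rows sum to 1)
    W:      (k, k) penalty matrix; W[i, j] is the cost of assigning
            probability to class j when the true class is i
            (diagonal typically zero, so correct mass is not penalized)

    Returns the batch mean of y_i^T W p_i.
    """
    per_example = W[y_true] * probs   # (n, k) cost-weighted probabilities
    return per_example.sum(axis=1).mean()

# Illustrative setup: confusing class 0 with class 2 is 5x worse
# than any other confusion.
W = np.array([[0., 1., 5.],
              [1., 0., 1.],
              [1., 1., 0.]])
y = np.array([0, 1])
p = np.array([[0.7, 0.1, 0.2],
              [0.2, 0.6, 0.2]])
loss = bilinear_loss(y, p, W)  # first example pays 5 * 0.2 for its 0->2 mass
```

With a uniform off-diagonal `W` this reduces to penalizing total misclassified mass equally; the asymmetric entries are what let the loss encode that some errors matter more than others.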

Cite this Paper


BibTeX
@InProceedings{pmlr-v74-resheff17a,
  title     = {Controlling Imbalanced Error in Deep Learning with the Log Bilinear Loss},
  author    = {Resheff, Yehezkel S. and Mandelbaum, Amit and Weinshall, Daphna},
  booktitle = {Proceedings of the First International Workshop on Learning with Imbalanced Domains: Theory and Applications},
  pages     = {141--151},
  year      = {2017},
  editor    = {Torgo, Lu{\'i}s and Branco, Paula and Moniz, Nuno},
  volume    = {74},
  series    = {Proceedings of Machine Learning Research},
  month     = {22 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v74/resheff17a/resheff17a.pdf},
  url       = {https://proceedings.mlr.press/v74/resheff17a.html},
  abstract  = {Deep learning has become the method of choice for many machine learning tasks in recent years, and especially for multi-class classification. The most common loss function used in this context is the cross-entropy loss. While this function is insensitive to the identity of the assigned class in the case of misclassification, in practice it is very common to have imbalanced sensitivity to error, meaning some wrong assignments are much worse than others. Here we present the bilinear-loss (and related log-bilinear-loss) which differentially penalizes the different wrong assignments of the model. We thoroughly test the proposed method using standard models and benchmark image datasets.}
}
Endnote
%0 Conference Paper
%T Controlling Imbalanced Error in Deep Learning with the Log Bilinear Loss
%A Yehezkel S. Resheff
%A Amit Mandelbaum
%A Daphna Weinshall
%B Proceedings of the First International Workshop on Learning with Imbalanced Domains: Theory and Applications
%C Proceedings of Machine Learning Research
%D 2017
%E Luís Torgo
%E Paula Branco
%E Nuno Moniz
%F pmlr-v74-resheff17a
%I PMLR
%P 141--151
%U https://proceedings.mlr.press/v74/resheff17a.html
%V 74
%X Deep learning has become the method of choice for many machine learning tasks in recent years, and especially for multi-class classification. The most common loss function used in this context is the cross-entropy loss. While this function is insensitive to the identity of the assigned class in the case of misclassification, in practice it is very common to have imbalanced sensitivity to error, meaning some wrong assignments are much worse than others. Here we present the bilinear-loss (and related log-bilinear-loss) which differentially penalizes the different wrong assignments of the model. We thoroughly test the proposed method using standard models and benchmark image datasets.
APA
Resheff, Y.S., Mandelbaum, A. & Weinshall, D. (2017). Controlling Imbalanced Error in Deep Learning with the Log Bilinear Loss. Proceedings of the First International Workshop on Learning with Imbalanced Domains: Theory and Applications, in Proceedings of Machine Learning Research 74:141-151. Available from https://proceedings.mlr.press/v74/resheff17a.html.