On Symmetric Losses for Learning from Corrupted Labels

Nontawat Charoenphakdee, Jongyeong Lee, Masashi Sugiyama
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:961-970, 2019.

Abstract

This paper aims to provide a better understanding of symmetric losses. First, we emphasize that using a symmetric loss is advantageous for balanced error rate (BER) minimization and for maximization of the area under the receiver operating characteristic curve (AUC) from corrupted labels. Second, we prove general theoretical properties of symmetric losses, including a classification-calibration condition, an excess risk bound, the conditional risk minimizer, and an AUC-consistency condition. Third, since all nonnegative symmetric losses are non-convex, we propose a convex barrier hinge loss that benefits significantly from the symmetric condition, although it is not symmetric everywhere. Finally, we conduct experiments to validate the relevance of the symmetric condition.
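For context on the symmetric condition discussed in the abstract: a margin loss ℓ is called symmetric when ℓ(z) + ℓ(−z) equals a constant for all z. The sketch below (not from the paper; the function names are illustrative) checks this condition numerically for the sigmoid loss, a standard symmetric loss, and contrasts it with the logistic loss, which is convex but not symmetric.

```python
import math

def sigmoid_loss(z):
    # Sigmoid loss: l(z) = 1 / (1 + e^z). Symmetric: l(z) + l(-z) = 1 for all z.
    return 1.0 / (1.0 + math.exp(z))

def logistic_loss(z):
    # Logistic loss: l(z) = log(1 + e^{-z}). Convex, but l(z) + l(-z) varies with z.
    return math.log(1.0 + math.exp(-z))

for z in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    s = sigmoid_loss(z) + sigmoid_loss(-z)    # constant (equals 1)
    t = logistic_loss(z) + logistic_loss(-z)  # depends on z
    print(f"z={z:+.1f}  sigmoid sum={s:.6f}  logistic sum={t:.6f}")
```

This constancy is what makes symmetric losses robust for BER minimization and AUC maximization under label corruption; note that, consistent with the abstract's remark, the sigmoid loss is bounded, nonnegative, and non-convex.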

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-charoenphakdee19a,
  title     = {On Symmetric Losses for Learning from Corrupted Labels},
  author    = {Charoenphakdee, Nontawat and Lee, Jongyeong and Sugiyama, Masashi},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {961--970},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/charoenphakdee19a/charoenphakdee19a.pdf},
  url       = {https://proceedings.mlr.press/v97/charoenphakdee19a.html},
  abstract  = {This paper aims to provide a better understanding of a symmetric loss. First, we emphasize that using a symmetric loss is advantageous in the balanced error rate (BER) minimization and area under the receiver operating characteristic curve (AUC) maximization from corrupted labels. Second, we prove general theoretical properties of symmetric losses, including a classification-calibration condition, excess risk bound, conditional risk minimizer, and AUC-consistency condition. Third, since all nonnegative symmetric losses are non-convex, we propose a convex barrier hinge loss that benefits significantly from the symmetric condition, although it is not symmetric everywhere. Finally, we conduct experiments to validate the relevance of the symmetric condition.}
}
Endnote
%0 Conference Paper
%T On Symmetric Losses for Learning from Corrupted Labels
%A Nontawat Charoenphakdee
%A Jongyeong Lee
%A Masashi Sugiyama
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-charoenphakdee19a
%I PMLR
%P 961--970
%U https://proceedings.mlr.press/v97/charoenphakdee19a.html
%V 97
%X This paper aims to provide a better understanding of a symmetric loss. First, we emphasize that using a symmetric loss is advantageous in the balanced error rate (BER) minimization and area under the receiver operating characteristic curve (AUC) maximization from corrupted labels. Second, we prove general theoretical properties of symmetric losses, including a classification-calibration condition, excess risk bound, conditional risk minimizer, and AUC-consistency condition. Third, since all nonnegative symmetric losses are non-convex, we propose a convex barrier hinge loss that benefits significantly from the symmetric condition, although it is not symmetric everywhere. Finally, we conduct experiments to validate the relevance of the symmetric condition.
APA
Charoenphakdee, N., Lee, J., & Sugiyama, M. (2019). On Symmetric Losses for Learning from Corrupted Labels. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:961-970. Available from https://proceedings.mlr.press/v97/charoenphakdee19a.html.