Asymmetric Loss Functions for Learning with Noisy Labels

Xiong Zhou, Xianming Liu, Junjun Jiang, Xin Gao, Xiangyang Ji
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:12846-12856, 2021.

Abstract

Robust loss functions are essential for training deep neural networks that generalize well in the presence of noisy labels. Symmetric loss functions have been shown to be robust to label noise, but the symmetric condition is overly restrictive. In this work, we propose a new class of loss functions, namely asymmetric loss functions, which are robust to label noise of arbitrary type. We then investigate general theoretical properties of asymmetric loss functions, including classification calibration, excess risk bounds, and noise tolerance. We also introduce the asymmetry ratio to measure the asymmetry of a loss function; empirical results show that a higher ratio provides better robustness. Moreover, we modify several common loss functions and establish the necessary and sufficient conditions for them to be asymmetric. Experiments on benchmark datasets demonstrate that asymmetric loss functions can outperform state-of-the-art methods.
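The full formulations are in the linked PDF; as a quick illustration of the idea, the sketch below implements one of the modified losses reported in the paper, the Asymmetric Generalized Cross Entropy (AGCE), as a drop-in PyTorch criterion. The form l(p_y) = ((a+1)^q - (a+p_y)^q)/q and the hyperparameter names a and q follow my reading of the paper; the default values here are assumptions, so treat this as an illustrative sketch rather than the authors' reference implementation.

```python
# Illustrative sketch (not the authors' code): the Asymmetric Generalized
# Cross Entropy (AGCE) loss, an asymmetric modification of generalized
# cross entropy from the paper. Hyperparameters a > 0 and q > 0 follow
# l(p_y) = ((a+1)^q - (a+p_y)^q) / q; the defaults below are assumed and
# should be tuned per dataset and noise level.
import torch
import torch.nn.functional as F


class AGCELoss(torch.nn.Module):
    def __init__(self, a: float = 1.0, q: float = 0.5):
        super().__init__()
        self.a = a
        self.q = q

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # p_y: predicted probability of the (possibly noisy) target class
        probs = F.softmax(logits, dim=1)
        p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
        loss = ((self.a + 1.0) ** self.q - (self.a + p_y) ** self.q) / self.q
        return loss.mean()


# Usage: a drop-in replacement for nn.CrossEntropyLoss when labels may be noisy.
criterion = AGCELoss(a=1.0, q=0.5)
loss = criterion(torch.randn(8, 10), torch.randint(0, 10, (8,)))
```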

Cite this Paper

BibTeX
@InProceedings{pmlr-v139-zhou21f,
  title     = {Asymmetric Loss Functions for Learning with Noisy Labels},
  author    = {Zhou, Xiong and Liu, Xianming and Jiang, Junjun and Gao, Xin and Ji, Xiangyang},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {12846--12856},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/zhou21f/zhou21f.pdf},
  url       = {https://proceedings.mlr.press/v139/zhou21f.html}
}
Endnote
%0 Conference Paper
%T Asymmetric Loss Functions for Learning with Noisy Labels
%A Xiong Zhou
%A Xianming Liu
%A Junjun Jiang
%A Xin Gao
%A Xiangyang Ji
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-zhou21f
%I PMLR
%P 12846--12856
%U https://proceedings.mlr.press/v139/zhou21f.html
%V 139
APA
Zhou, X., Liu, X., Jiang, J., Gao, X. & Ji, X. (2021). Asymmetric Loss Functions for Learning with Noisy Labels. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:12846-12856. Available from https://proceedings.mlr.press/v139/zhou21f.html.
