Learning with Bounded Instance and Label-dependent Label Noise

Jiacheng Cheng, Tongliang Liu, Kotagiri Ramamohanarao, Dacheng Tao
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1789-1799, 2020.

Abstract

Instance- and Label-dependent label Noise (ILN) is widespread in real-world datasets but has rarely been studied. In this paper, we focus on Bounded Instance- and Label-dependent label Noise (BILN), a particular case of ILN in which the label noise rates—the probabilities that the true labels of examples flip into the corrupted ones—have an upper bound less than $1$. Specifically, we introduce the concept of distilled examples, i.e., examples whose labels are identical to the labels assigned to them by the Bayes optimal classifier, and prove that under certain conditions classifiers learnt on distilled examples converge to the Bayes optimal classifier. Inspired by the idea of learning with distilled examples, we then propose a learning algorithm with theoretical guarantees on its robustness to BILN. Finally, empirical evaluations on both synthetic and real-world datasets show the effectiveness of our algorithm in learning with BILN.
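
For concreteness, the two definitions the abstract relies on can be written out as follows for the binary case. The notation below ($\tilde{Y}$ for the noisy label, $\rho_{\pm 1}$ for the noise rates, $f^{*}$ for the Bayes optimal classifier) is editorial shorthand consistent with the abstract, not an excerpt from the paper.

% Sketch of the definitions in the abstract (binary case); notation assumed, not quoted from the paper.
% Noise rate: probability that the true label of instance x flips into the other class.
\[
  \rho_{y}(x) \;=\; \mathbb{P}\bigl(\tilde{Y} = -y \,\big|\, Y = y,\; X = x\bigr),
  \qquad y \in \{-1, +1\}.
\]
% BILN: both instance-dependent noise rates are bounded above by constants strictly less than 1.
\[
  \rho_{+1}(x) \le \rho_{+1}^{\max} < 1,
  \qquad
  \rho_{-1}(x) \le \rho_{-1}^{\max} < 1,
  \qquad \forall x.
\]
% Distilled example: an observed example whose label agrees with the Bayes optimal
% classifier f^* of the clean distribution.
\[
  f^{*}(x) = \operatorname{sign}\!\bigl(\mathbb{P}(Y = +1 \mid X = x) - \tfrac{1}{2}\bigr),
  \qquad
  (x, \tilde{y}) \text{ is distilled} \;\iff\; \tilde{y} = f^{*}(x).
\]

Under this reading, the paper's claim is that training on the distilled subset alone recovers (in the limit) the Bayes optimal classifier, which motivates the proposed algorithm.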

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cheng20c,
  title     = {Learning with Bounded Instance and Label-dependent Label Noise},
  author    = {Cheng, Jiacheng and Liu, Tongliang and Ramamohanarao, Kotagiri and Tao, Dacheng},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1789--1799},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/cheng20c/cheng20c.pdf},
  url       = {https://proceedings.mlr.press/v119/cheng20c.html},
  abstract  = {Instance- and Label-dependent label Noise (ILN) widely exists in real-world datasets but has been rarely studied. In this paper, we focus on Bounded Instance- and Label-dependent label Noise (BILN), a particular case of ILN where the label noise rates—the probabilities that the true labels of examples flip into the corrupted ones—have upper bound less than $1$. Specifically, we introduce the concept of distilled examples, i.e. examples whose labels are identical with the labels assigned for them by the Bayes optimal classifier, and prove that under certain conditions classifiers learnt on distilled examples will converge to the Bayes optimal classifier. Inspired by the idea of learning with distilled examples, we then propose a learning algorithm with theoretical guarantees for its robustness to BILN. At last, empirical evaluations on both synthetic and real-world datasets show effectiveness of our algorithm in learning with BILN.}
}
Endnote
%0 Conference Paper
%T Learning with Bounded Instance and Label-dependent Label Noise
%A Jiacheng Cheng
%A Tongliang Liu
%A Kotagiri Ramamohanarao
%A Dacheng Tao
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-cheng20c
%I PMLR
%P 1789--1799
%U https://proceedings.mlr.press/v119/cheng20c.html
%V 119
%X Instance- and Label-dependent label Noise (ILN) widely exists in real-world datasets but has been rarely studied. In this paper, we focus on Bounded Instance- and Label-dependent label Noise (BILN), a particular case of ILN where the label noise rates—the probabilities that the true labels of examples flip into the corrupted ones—have upper bound less than $1$. Specifically, we introduce the concept of distilled examples, i.e. examples whose labels are identical with the labels assigned for them by the Bayes optimal classifier, and prove that under certain conditions classifiers learnt on distilled examples will converge to the Bayes optimal classifier. Inspired by the idea of learning with distilled examples, we then propose a learning algorithm with theoretical guarantees for its robustness to BILN. At last, empirical evaluations on both synthetic and real-world datasets show effectiveness of our algorithm in learning with BILN.
APA
Cheng, J., Liu, T., Ramamohanarao, K. & Tao, D. (2020). Learning with Bounded Instance and Label-dependent Label Noise. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1789-1799. Available from https://proceedings.mlr.press/v119/cheng20c.html.
