Consistent Complementary-Label Learning via Order-Preserving Losses

Shuqi Liu, Yuzhou Cao, Qiaozhen Zhang, Lei Feng, Bo An
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:8734-8748, 2023.

Abstract

In contrast to ordinary supervised classification tasks that require massive data with high-quality labels, complementary-label learning (CLL) deals with the weakly-supervised learning scenario where each instance is equipped with a complementary label, which specifies a class the instance does not belong to. However, most of the existing statistically consistent CLL methods intrinsically suffer from overfitting due to the negative empirical risk issue. In this paper, we propose overfitting-resistant and theoretically grounded methods for CLL. Considering the unique property of the distribution of complementarily labeled samples, we provide a risk estimator via order-preserving losses, which are naturally non-negative and thus avoid the overfitting caused by negative terms in risk estimators. Moreover, we provide a classifier-consistency analysis and statistical guarantees for this estimator. Furthermore, we provide a weighted version of the proposed risk estimator to further enhance its generalization ability and prove its statistical consistency. Experiments on benchmark datasets demonstrate the effectiveness of our proposed methods.
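
As background for the negative empirical risk issue mentioned above, and not as the paper's proposed order-preserving estimator, the sketch below spells out a widely used unbiased risk estimator for CLL under the uniform complementary-label assumption: its subtracted (K-1) term can push the empirical risk below zero during training, which is the source of the overfitting described in the abstract. The function name and the PyTorch/cross-entropy setup are illustrative assumptions, not taken from this paper.

    # Illustrative sketch: the classic unbiased CLL risk estimator under the
    # uniform complementary-label assumption. This is prior-work background,
    # not the order-preserving estimator proposed in this paper.
    import torch
    import torch.nn.functional as F

    def unbiased_cll_risk(logits, comp_labels, num_classes):
        # Empirical version of E[ sum_k l(f(x), k) - (K - 1) * l(f(x), y_bar) ],
        # where y_bar is the complementary label and l is cross entropy. The
        # subtracted term can make this estimate negative, which is what drives
        # the overfitting the abstract refers to.
        log_probs = F.log_softmax(logits, dim=1)        # (n, K) log-probabilities
        loss_all = -log_probs                           # l(f(x), k) for every class k
        sum_over_classes = loss_all.sum(dim=1)          # sum_k l(f(x), k)
        loss_comp = loss_all.gather(1, comp_labels.view(-1, 1)).squeeze(1)
        return (sum_over_classes - (num_classes - 1) * loss_comp).mean()

A naturally non-negative estimator, such as the order-preserving construction studied in the paper, rules out this failure mode by design.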

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-liu23g,
  title     = {Consistent Complementary-Label Learning via Order-Preserving Losses},
  author    = {Liu, Shuqi and Cao, Yuzhou and Zhang, Qiaozhen and Feng, Lei and An, Bo},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {8734--8748},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/liu23g/liu23g.pdf},
  url       = {https://proceedings.mlr.press/v206/liu23g.html},
  abstract  = {In contrast to ordinary supervised classification tasks that require massive data with high-quality labels, complementary-label learning (CLL) deals with the weakly-supervised learning scenario where each instance is equipped with a complementary label, which specifies a class the instance does not belong to. However, most of the existing statistically consistent CLL methods intrinsically suffer from overfitting due to the negative empirical risk issue. In this paper, we propose overfitting-resistant and theoretically grounded methods for CLL. Considering the unique property of the distribution of complementarily labeled samples, we provide a risk estimator via order-preserving losses, which are naturally non-negative and thus avoid the overfitting caused by negative terms in risk estimators. Moreover, we provide a classifier-consistency analysis and statistical guarantees for this estimator. Furthermore, we provide a weighted version of the proposed risk estimator to further enhance its generalization ability and prove its statistical consistency. Experiments on benchmark datasets demonstrate the effectiveness of our proposed methods.}
}
Endnote
%0 Conference Paper
%T Consistent Complementary-Label Learning via Order-Preserving Losses
%A Shuqi Liu
%A Yuzhou Cao
%A Qiaozhen Zhang
%A Lei Feng
%A Bo An
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-liu23g
%I PMLR
%P 8734--8748
%U https://proceedings.mlr.press/v206/liu23g.html
%V 206
%X In contrast to ordinary supervised classification tasks that require massive data with high-quality labels, complementary-label learning (CLL) deals with the weakly-supervised learning scenario where each instance is equipped with a complementary label, which specifies a class the instance does not belong to. However, most of the existing statistically consistent CLL methods intrinsically suffer from overfitting due to the negative empirical risk issue. In this paper, we propose overfitting-resistant and theoretically grounded methods for CLL. Considering the unique property of the distribution of complementarily labeled samples, we provide a risk estimator via order-preserving losses, which are naturally non-negative and thus avoid the overfitting caused by negative terms in risk estimators. Moreover, we provide a classifier-consistency analysis and statistical guarantees for this estimator. Furthermore, we provide a weighted version of the proposed risk estimator to further enhance its generalization ability and prove its statistical consistency. Experiments on benchmark datasets demonstrate the effectiveness of our proposed methods.
APA
Liu, S., Cao, Y., Zhang, Q., Feng, L. & An, B. (2023). Consistent Complementary-Label Learning via Order-Preserving Losses. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:8734-8748. Available from https://proceedings.mlr.press/v206/liu23g.html.
