Complementary-Label Learning for Arbitrary Losses and Models

Takashi Ishida, Gang Niu, Aditya Menon, Masashi Sugiyama
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:2971-2980, 2019.

Abstract

In contrast to the standard classification paradigm, where the true class is given for each training pattern, complementary-label learning uses only training patterns that are each equipped with a complementary label, which specifies one of the classes the pattern does not belong to. The goal of this paper is to derive a novel framework of complementary-label learning with an unbiased estimator of the classification risk for arbitrary losses and models; no existing method has achieved this goal. This is beneficial not only for the learning stage: it also makes model and hyper-parameter selection through cross-validation possible without any ordinarily labeled validation data, while using any linear or non-linear model and any convex or non-convex loss function. We further improve the risk estimator with a non-negative correction and a gradient-ascent trick, and demonstrate its superiority through experiments.
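A sketch of the identity behind the abstract's claim, assuming K classes and a complementary label \bar{y} drawn uniformly from the K - 1 incorrect classes: for any loss \ell, the ordinary classification risk R(f) = \mathbb{E}_{p(x,y)}[\ell(f(x), y)] can be rewritten over the complementary-label distribution \bar{p}(x, \bar{y}) as

    R(f) = \mathbb{E}_{\bar{p}(x,\bar{y})}\Big[ -(K-1)\,\ell(f(x),\bar{y}) + \sum_{j=1}^{K} \ell(f(x), j) \Big].

Replacing the expectation with a sample average over complementarily labeled data then gives an unbiased empirical risk for any loss and any model. Below is a minimal PyTorch sketch of that estimator with cross-entropy as the base loss; the function and variable names are illustrative, not the authors' reference implementation:

    import torch
    import torch.nn.functional as F

    def unbiased_complementary_risk(logits, comp_labels):
        # logits: (n, K) model outputs; comp_labels: (n,) complementary labels,
        # assumed drawn uniformly over the K-1 wrong classes (the paper's setting).
        # Base loss: cross-entropy, l(f(x), j) = -log softmax_j(f(x)).
        n, K = logits.shape
        per_class_loss = -F.log_softmax(logits, dim=1)  # l(f(x), j) for every class j
        loss_on_comp = per_class_loss.gather(1, comp_labels.view(-1, 1)).squeeze(1)
        # Unbiased rewrite: -(K-1) * l(f(x), ybar) + sum_j l(f(x), j)
        return (-(K - 1) * loss_on_comp + per_class_loss.sum(dim=1)).mean()

Because of the negative -(K-1) term, this empirical estimate can dip below zero for flexible models even though the true risk is non-negative; the non-negative correction and gradient-ascent trick mentioned in the abstract clamps the offending partial risks at zero and ascends their gradient when they turn negative.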

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-ishida19a,
  title     = {Complementary-Label Learning for Arbitrary Losses and Models},
  author    = {Ishida, Takashi and Niu, Gang and Menon, Aditya and Sugiyama, Masashi},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {2971--2980},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/ishida19a/ishida19a.pdf},
  url       = {https://proceedings.mlr.press/v97/ishida19a.html}
}
Endnote
%0 Conference Paper
%T Complementary-Label Learning for Arbitrary Losses and Models
%A Takashi Ishida
%A Gang Niu
%A Aditya Menon
%A Masashi Sugiyama
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-ishida19a
%I PMLR
%P 2971--2980
%U https://proceedings.mlr.press/v97/ishida19a.html
%V 97
APA
Ishida, T., Niu, G., Menon, A. & Sugiyama, M. (2019). Complementary-Label Learning for Arbitrary Losses and Models. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:2971-2980. Available from https://proceedings.mlr.press/v97/ishida19a.html.
