Disambiguation of Weak Supervision leading to Exponential Convergence rates

Vivien A Cabannnes, Francis Bach, Alessandro Rudi
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:1147-1157, 2021.

Abstract

Machine learning approached through supervised learning requires expensive annotation of data. This motivates weakly supervised learning, where data are annotated with incomplete yet discriminative information. In this paper, we focus on partial labelling, an instance of weak supervision where, from a given input, we are given a set of potential targets. We review a disambiguation principle to recover full supervision from weak supervision, and propose an empirical disambiguation algorithm. We prove exponential convergence rates of our algorithm under classical learnability assumptions, and we illustrate the usefulness of our method on practical examples.
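To make the partial-labelling setting concrete: each input comes with a *set* of candidate labels rather than a single target, and a disambiguation procedure must recover one label per input. The sketch below is purely illustrative and is not the paper's algorithm; it alternates between fitting class prototypes from current label guesses and reassigning each sample the candidate label whose prototype is nearest (all names, such as `disambiguate`, are hypothetical).

```python
# Illustrative sketch of partial-label disambiguation (NOT the paper's
# exact method): each sample carries a set of candidate labels; we
# alternate between fitting per-class mean prototypes and reassigning
# each sample the candidate whose prototype is nearest.
import numpy as np

def disambiguate(X, candidate_sets, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    # Initialise each sample with a random candidate from its set.
    y = np.array([rng.choice(sorted(s)) for s in candidate_sets])
    labels = np.unique(np.concatenate([sorted(s) for s in candidate_sets]))
    for _ in range(n_iter):
        # Class prototypes from the current (guessed) labels.
        protos = {c: X[y == c].mean(axis=0) for c in labels if np.any(y == c)}
        # Reassign: within each candidate set, pick the label whose
        # prototype lies closest to the sample.
        new_y = []
        for x, s in zip(X, candidate_sets):
            cands = [c for c in sorted(s) if c in protos]
            new_y.append(min(cands, key=lambda c: np.linalg.norm(x - protos[c])))
        y = np.array(new_y)
    return y

# Two well-separated clusters; some points have the ambiguous set {0, 1}.
X = np.array([[0.0, 0], [0.1, 0], [5.0, 0], [5.1, 0]])
sets = [{0}, {0, 1}, {1}, {0, 1}]
print(disambiguate(X, sets))
```

On this toy example the ambiguous points snap to the label of their own cluster, which is the qualitative behaviour one expects from a disambiguation principle under a separation (learnability) assumption.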

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-cabannnes21a,
  title     = {Disambiguation of Weak Supervision leading to Exponential Convergence rates},
  author    = {Cabannnes, Vivien A and Bach, Francis and Rudi, Alessandro},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {1147--1157},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/cabannnes21a/cabannnes21a.pdf},
  url       = {https://proceedings.mlr.press/v139/cabannnes21a.html},
  abstract  = {Machine learning approached through supervised learning requires expensive annotation of data. This motivates weakly supervised learning, where data are annotated with incomplete yet discriminative information. In this paper, we focus on partial labelling, an instance of weak supervision where, from a given input, we are given a set of potential targets. We review a disambiguation principle to recover full supervision from weak supervision, and propose an empirical disambiguation algorithm. We prove exponential convergence rates of our algorithm under classical learnability assumptions, and we illustrate the usefulness of our method on practical examples.}
}
Endnote
%0 Conference Paper
%T Disambiguation of Weak Supervision leading to Exponential Convergence rates
%A Vivien A Cabannnes
%A Francis Bach
%A Alessandro Rudi
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-cabannnes21a
%I PMLR
%P 1147--1157
%U https://proceedings.mlr.press/v139/cabannnes21a.html
%V 139
%X Machine learning approached through supervised learning requires expensive annotation of data. This motivates weakly supervised learning, where data are annotated with incomplete yet discriminative information. In this paper, we focus on partial labelling, an instance of weak supervision where, from a given input, we are given a set of potential targets. We review a disambiguation principle to recover full supervision from weak supervision, and propose an empirical disambiguation algorithm. We prove exponential convergence rates of our algorithm under classical learnability assumptions, and we illustrate the usefulness of our method on practical examples.
APA
Cabannnes, V.A., Bach, F. & Rudi, A. (2021). Disambiguation of Weak Supervision leading to Exponential Convergence rates. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:1147-1157. Available from https://proceedings.mlr.press/v139/cabannnes21a.html.